ORCID Profile
0000-0001-5491-3460
Current Organisations: Hue University, University of Queensland, Princess Alexandra Hospital
Publisher: AMPCo
Date: 08-2012
DOI: 10.5694/MJA11.11329
Abstract: The publication of the Australasian Creatinine Consensus Working Group's position statements in 2005 and 2007 resulted in automatic reporting of estimated glomerular filtration rate (eGFR) with requests for serum creatinine concentration in adults, facilitated the unification of units of measurement for creatinine and eGFR, and promoted the standardisation of assays. New advances and continuing debate led the Australasian Creatinine Consensus Working Group to reconvene in 2010. The working group recommends that the method of calculating eGFR should be changed to the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) formula, and that all laboratories should report eGFR values as a precise figure up to at least 90 mL/min/1.73 m2. Age-related decision points for eGFR in adults are not recommended: although an eGFR < 60 mL/min/1.73 m2 is very common in older people, it is nevertheless predictive of significantly increased risks of adverse clinical outcomes and should not be considered a normal part of ageing. If using eGFR for drug dosing, body size should be considered, in addition to referring to the approved product information. For drugs with a narrow therapeutic index, therapeutic drug monitoring or a valid marker of drug effect should be used to individualise dosing. The CKD-EPI formula has been validated as a tool to estimate GFR in some populations of non-European ancestry living in Western countries. Pending publication of validation studies, the working group also recommends that Australasian laboratories continue to automatically report eGFR in Aboriginal and Torres Strait Islander peoples. The working group concluded that routine calculation of eGFR is not recommended in children and youth, or in pregnant women. Serum creatinine concentration (preferably using an enzymatic assay for paediatric patients) should remain the standard test for kidney function in these populations.
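For context, the 2009 CKD-EPI creatinine equation recommended in the abstract above can be sketched as follows. This is an illustrative transcription of the published formula (serum creatinine in mg/dL), not code from the position statement; the function name and argument layout are assumptions.

```python
# Illustrative sketch of the 2009 CKD-EPI creatinine equation; not from the paper.

def ckd_epi_egfr(scr_mg_dl: float, age: float, female: bool, black: bool = False) -> float:
    """Estimated GFR in mL/min/1.73 m2 using the 2009 CKD-EPI creatinine equation."""
    kappa = 0.7 if female else 0.9      # sex-specific creatinine scaling constant
    alpha = -0.329 if female else -0.411  # sex-specific exponent for Scr below kappa
    egfr = 141.0
    egfr *= min(scr_mg_dl / kappa, 1.0) ** alpha   # applies when Scr <= kappa
    egfr *= max(scr_mg_dl / kappa, 1.0) ** -1.209  # applies when Scr > kappa
    egfr *= 0.993 ** age                           # age decay factor
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

# Example (hypothetical patient): 60-year-old male, Scr 1.0 mg/dL
egfr = ckd_epi_egfr(scr_mg_dl=1.0, age=60, female=False)
```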
Publisher: Springer Science and Business Media LLC
Date: 29-07-2020
Publisher: Wiley
Date: 13-09-2021
DOI: 10.1111/NEP.13775
Publisher: Elsevier BV
Date: 02-2003
Publisher: Elsevier BV
Date: 07-2018
DOI: 10.1053/J.AJKD.2017.12.018
Abstract: Fatigue is one of the most highly prioritized outcomes for patients and clinicians, but remains infrequently and inconsistently reported across trials in hemodialysis. We convened an international Standardized Outcomes in Nephrology-Hemodialysis (SONG-HD) consensus workshop with stakeholders to discuss the development and implementation of a core outcome measure for fatigue. 15 patients/caregivers and 42 health professionals (clinicians, researchers, policy makers, and industry representatives) from 9 countries participated in breakout discussions. Transcripts were analyzed thematically. 4 themes for a core outcome measure emerged. Drawing attention to a distinct and all-encompassing symptom meant explicitly recognizing fatigue as a multifaceted symptom unique to hemodialysis. Emphasizing the pervasive impact of fatigue on life participation justified the focus on how fatigue severely impaired the patient's ability to do usual activities. Ensuring relevance and accuracy in measuring fatigue would facilitate shared decision making about treatment. Minimizing burden of administration meant avoiding the cognitive burden, additional time, and resources required to use the measure. A core outcome measure that is simple, is short, and includes a focus on the severity of the impact of fatigue on life participation may facilitate consistent and meaningful measurement of fatigue in all trials to inform decision making and care of patients receiving hemodialysis.
Publisher: SAGE Publications
Date: 28-02-2023
DOI: 10.1177/08968608221079183
Abstract: Pre-training peritonitis (PTP), defined as peritonitis occurring after catheter insertion and before peritoneal dialysis (PD) training, is increasingly recognized as a risk factor for adverse patient outcomes, yet remains poorly understood, with few studies conducted to date. This study was conducted to identify the associations, microbiologic profiles and outcomes of PTP compared to post-training peritonitis. This single-centre, case-control study involved patients with kidney failure who had PD as their first kidney replacement therapy and had experienced an episode of PD peritonitis between 1 January 2005 and 31 December 2015. Individuals experiencing their first episode of peritonitis were included in the study and categorized according to whether it occurred pre- or post-training. The primary outcomes were peritonitis cure rates and the composite outcome of hemodialysis (HD) transfer for ≥30 days or death. The secondary outcomes included catheter removal and refractory peritonitis rates. Among 683 patients who received PD for the first time, 121 (17.7%) had PTP while 265 (38.8%) had post-training peritonitis. PTP patients were more likely to have had exit-site infection (ESI) prior to peritonitis (24.8% compared to 17% in the post-training peritonitis group, p = 0.2). Culture-negative peritonitis was significantly more common in the PTP patients (53.7%) than in the post-training group (27.3%, p < 0.001). Cure was achieved in 68.9% of cases and was not significantly different between the PTP and post-training peritonitis groups (66.1% vs. 70.2%; OR 0.83, 95% CI 0.51–1.35). Lower odds of cure were associated with peritonitis caused by moderate- and high-severity organisms (OR 0.49, 95% CI 0.29–0.85 and OR 0.18, 95% CI 0.08–0.43, respectively). The composite outcome of HD transfer or death was more commonly observed among patients with PTP (87.5% vs. 75.8%; OR 2.2, 95% CI 1.20–4.48), in whom a significantly shorter median time to HD transfer was observed (PTP 10.7 months vs. post-training peritonitis 21.9 months, p < 0.0001). PTP is a common condition that is highly associated with preceding ESI, is frequently culture-negative and is associated with a worse composite outcome of HD transfer or death. PTP rates should be routinely monitored and reported by PD units for continuous quality improvement.
Publisher: Elsevier BV
Date: 2015
DOI: 10.1053/J.AJKD.2014.06.020
Abstract: Erythropoiesis-stimulating agent (ESA)-hyporesponsive anemia is common in chronic kidney disease (CKD). Pentoxifylline shows promise as a treatment for ESA-hyporesponsive anemia, but has not been rigorously evaluated. Multicenter, double-blind, randomized, controlled trial. 53 adult patients with CKD stage 4 or 5 (including dialysis) and ESA-hyporesponsive anemia (hemoglobin ≤120 g/L and ESA resistance index [calculated as weight-adjusted weekly ESA dose in IU/kg/wk divided by hemoglobin concentration in g/L] ≥1.0 IU/kg/wk/g/L for erythropoietin-treated patients and ≥0.005 μg/kg/wk/g/L for darbepoetin-treated patients). Pentoxifylline (400 mg/d; n=26) or matching placebo (control; n=27) for 4 months. The primary outcome was ESA resistance index at 4 months; secondary outcomes were hemoglobin concentration, ESA dose, blood transfusion requirement, serum ferritin level and transferrin saturation, C-reactive protein level, adverse events, quality of life, and health economics. There was no statistically significant difference in ESA resistance index between the pentoxifylline and control groups (adjusted mean difference, -0.39 [95% CI, -0.89 to 0.10] IU/kg/wk/g/L; P=0.1). Pentoxifylline significantly increased hemoglobin concentration relative to the control group (adjusted mean difference, 7.6 [95% CI, 1.7-13.5] g/L; P=0.01). There was no difference in ESA dose between groups (-20.8 [95% CI, -67.2 to 25.7] IU/kg/wk; P=0.4). No differences in blood transfusion requirements, adverse events, or quality of life were observed between groups. Pentoxifylline cost A$88.05 (US $82.94) per person over the trial and produced mean savings in ESA cost of A$1,332 (US $1,255). The overall economic impact over the trial period was a saving of A$1,244 (US $1,172) per person for the pentoxifylline group compared with controls. Sample size was smaller than planned due to slow recruitment. Pentoxifylline did not significantly modify ESA hyporesponsiveness, but increased hemoglobin concentration. Further studies are warranted to determine whether pentoxifylline therapy represents a safe strategy for increasing hemoglobin levels in patients with CKD with ESA-hyporesponsive anemia.
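The ESA resistance index (ERI) entry criterion defined in the abstract above is simple arithmetic; as a hedged sketch (function name and patient values are hypothetical, only the formula comes from the abstract):

```python
# ERI as defined in the abstract: weight-adjusted weekly ESA dose (IU/kg/wk)
# divided by hemoglobin concentration (g/L).

def eri(weekly_esa_dose_iu: float, weight_kg: float, hb_g_per_l: float) -> float:
    """ESA resistance index in IU/kg/wk per g/L."""
    return (weekly_esa_dose_iu / weight_kg) / hb_g_per_l

# Hypothetical patient: 12,000 IU/wk erythropoietin, 70 kg, Hb 110 g/L.
# 12000/70 ≈ 171.4 IU/kg/wk; 171.4/110 ≈ 1.56, above the trial's ≥1.0 threshold
# for erythropoietin-treated patients (i.e. ESA-hyporesponsive by trial criteria).
patient_eri = eri(weekly_esa_dose_iu=12000, weight_kg=70, hb_g_per_l=110)
```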
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 12-2021
DOI: 10.2215/CJN.08360621
Abstract: Dietary potassium restriction in people receiving maintenance hemodialysis is standard practice and is recommended in guidelines, despite a lack of evidence. We aimed to assess the association between dietary potassium intake and mortality and whether hyperkalemia mediates this association. A total of 8043 adults undergoing maintenance hemodialysis in Europe and South America were included in the DIETary intake, death and hospitalization in adults with end-stage kidney disease treated with HemoDialysis (DIET-HD) study. We measured baseline potassium intake from the Global Allergy and Asthma European Network food frequency questionnaire and performed time-to-event and mediation analyses. The median potassium intake at baseline was 3.5 (interquartile range, 2.5–5.0) g/d. During a median follow-up of 4.0 years (25,890 person-years), we observed 2921 (36%) deaths. After adjusting for baseline characteristics, including cardiac disease and food groups, dietary potassium intake was not associated with all-cause mortality (per 1 g/d higher dietary potassium intake: hazard ratio, 1.00; 95% confidence interval [95% CI], 0.95 to 1.05). A mediation analysis showed no association of potassium intake with mortality, either through or independent of serum potassium (hazard ratio, 1.00; 95% CI, 1.00 to 1.00 and hazard ratio, 1.01; 95% CI, 0.96 to 1.06, respectively). Potassium intake was not significantly associated with serum levels (0.03; 95% CI, −0.01 to 0.07 mEq/L per 1 g/d higher dietary potassium intake) or the prevalence of hyperkalemia (≥6.0 mEq/L) at baseline (odds ratio, 1.11; 95% CI, 0.89 to 1.37 per 1 g/d higher dietary potassium intake). Hyperkalemia was associated with cardiovascular death (hazard ratio, 1.23; 95% CI, 1.03 to 1.48). Higher dietary intake of potassium is not associated with hyperkalemia or death in patients treated with hemodialysis.
Publisher: Elsevier BV
Date: 2002
Abstract: The pseudo-Pelger-Huet (PH) anomaly has been associated with a variety of primary haematological disorders, infections and drugs. Recently, the development of dysgranulopoiesis characterised by a pseudo-PH anomaly has been reported in two patients with the use of mycophenolate mofetil (MMF) in the setting of heart and/or lung transplantation. We present a further five cases of MMF-related dysgranulopoiesis characterised by a pseudo-PH anomaly occurring after renal transplantation. All patients were receiving standard immunosuppression protocols for renal transplantation, including a combination of MMF, steroids and either cyclosporin or tacrolimus. Oral ganciclovir was also used for cytomegalovirus prophylaxis in each case. Development of dysplastic granulopoiesis occurred a median of 96 days (range 66-196 days) after transplantation. Moderate or severe neutropaenia (<1.0 x 10(9)/l) developed in three cases, and appeared to be directly correlated with the percentage of circulating neutrophils present with dysplastic morphology. Resolution of dysgranulopoiesis occurred in all cases only after dose reduction and/or cessation of both MMF and ganciclovir. In our series, the observed dysplastic granulopoiesis appeared related to the combination of MMF and ganciclovir, rather than MMF alone. Further study is required to determine the exact incidence and pathogenesis of this pattern of bone marrow toxicity.
Publisher: Springer Science and Business Media LLC
Date: 22-02-2022
Publisher: SAGE Publications
Date: 31-05-2023
DOI: 10.1177/08968608231175826
Abstract: Incremental peritoneal dialysis (PD), defined as a less than full-dose PD prescription, has several possible merits, including better preservation of residual kidney function (RKF), lower peritoneal glucose exposure and reduced risk of peritonitis. The aims of this study were to analyse the association of Incremental and Full-dose PD strategies with RKF and urine volume (UV) decline in patients commencing PD. Incident PD patients who participated in the balANZ randomised controlled trial (RCT) (2004–2010) and had at least one post-baseline RKF and UV measurement were included in this study. Patients receiving <56 L/week and ≥56 L/week of PD fluid at PD commencement were classified as Incremental and Full-dose PD, respectively. An alternative cut-point of 42 L/week was used in a sensitivity analysis. The primary and secondary outcomes were changes in measured RKF and daily UV, respectively. The study included 154 patients (mean age 57.9 ± 14.1 years, 44% female, 34% diabetic, mean follow-up 19.5 ± 6.6 months). Incremental and Full-dose PD were commenced by 45 (29.2%) and 109 (70.8%) participants, respectively. RKF declined in the Incremental group from 7.9 ± 3.2 mL/min/1.73 m2 at baseline to 3.2 ± 2.9 mL/min/1.73 m2 at 24 months (p < 0.001), and in the Full-dose PD group from 7.3 ± 2.7 mL/min/1.73 m2 at baseline to 3.4 ± 2.8 mL/min/1.73 m2 at 24 months (p < 0.001). There was no difference in the slope of RKF decline between Incremental and Full-dose PD (p = 0.78). UV declined from 1.81 ± 0.73 L/day at baseline to 0.64 ± 0.63 L/day at 24 months in the Incremental PD group (p < 0.001) and from 1.38 ± 0.61 L/day to 0.71 ± 0.46 L/day in the Full-dose PD group (p < 0.001). There was no difference in the slope of UV decline between Incremental and Full-dose PD (p = 0.18). Compared with a Full-dose PD start, an Incremental PD start is associated with similar declines in RKF and UV.
Publisher: SAGE Publications
Date: 16-02-2021
Abstract: This national survey of barriers to and constraints on acute peritoneal dialysis (aPD) in acute kidney injury (AKI) was performed by distributing an online questionnaire to all medical directors of public dialysis units registered with the Nephrology Society of Thailand during September–November 2019. One hundred and thirteen adult facilities responded to the survey, covering 75 of 76 provinces (99%) of Thailand. aPD was performed in 66 centres (58%). In facilities where aPD practice was available, the utilization rate was relatively low (… cases/year) and limited to specific conditions, including HIV-seropositive patients, patients who had previously received dialysis education and planning, and those with difficult vascular access creation. Only 9% of facilities performed aPD routinely, but interestingly all such units permitted bedside catheter insertion by nephrologists or internists. The major constraints placed on aPD practice were PD catheter insertion competency, timely catheter insertion support and the medical supporting team's knowledge/competency deficits. aPD for AKI is underutilized in Thailand and limited by the inability to undertake timely PD catheter insertion and by knowledge and competency deficits.
Publisher: Massachusetts Medical Society
Date: 12-08-2010
Publisher: Wiley
Date: 29-03-2012
Publisher: S. Karger AG
Date: 2011
DOI: 10.1159/000327146
Abstract: In clinical practice there is considerable variation in the timing of initiation of dialysis. The IDEAL trial (Initiating Dialysis Early and Late study) showed that planned early initiation of dialysis in patients with stage 5 chronic kidney disease (CKD) was not associated with an improvement in clinical outcome, but was associated with increased costs. The predominant dialysis modality worldwide is hemodialysis (HD). This subanalysis of the IDEAL trial examined whether the timing of the initiation of dialysis in those who had chosen HD influenced survival and the occurrence of complications. Patients in the IDEAL trial were older than 18 years and had progressive advanced CKD. They were randomly assigned to commence dialysis at an estimated glomerular filtration rate (eGFR) of 10-14 ml/min (early start) or when the eGFR was 5-7 ml/min (late start). The primary outcome was death from any cause. Between 2000 and 2008, 362 of the 828 patients (43.7%) randomized in the trial planned to commence HD. 322 (88.9%) of these subsequently commenced HD and 17 (4.7%) commenced peritoneal dialysis, with a median time to the initiation of dialysis of 1.63 months in the early-start group and 6.93 months in the late-start group. During a median follow-up time of 3.81 years, 50 of 171 patients in the early-start group (29.2%) and 59 in the late-start group (30.1%) died (hazard ratio with early initiation, 0.97; 95% CI, 0.66-1.41; p=0.86). There was no significant difference in the frequency of cardiovascular events, infections, or access-related events, but there was a significantly higher frequency of fluid and electrolyte events in the late-start group (p=0.02). In this subanalysis of the IDEAL trial, patients with stage 5 CKD commencing dialysis early for whom the planned dialysis modality was HD did not have an improvement in survival or any reduction in most clinical outcomes, apart from fluid and electrolyte events.
Publisher: SAGE Publications
Date: 11-2015
Abstract: The HONEYPOT study recently reported that daily exit-site application of antibacterial honey was not superior to nasal mupirocin prophylaxis for preventing overall peritoneal dialysis (PD)-related infection. This paper reports a secondary outcome analysis of the HONEYPOT study with respect to exit-site infection (ESI) and peritonitis microbiology, infectious hospitalization and technique failure. A total of 371 PD patients were randomized to daily exit-site application of antibacterial honey plus usual exit-site care (N = 186) or intranasal mupirocin prophylaxis (in nasal Staphylococcus aureus carriers only) plus usual exit-site care (control, N = 185). Groups were compared on rates of organism-specific ESI and peritonitis, peritonitis- and infection-associated hospitalization, and technique failure (PD withdrawal). The mean peritonitis rates in the honey and control groups were 0.41 (95% confidence interval [CI] 0.32 – 0.50) and 0.41 (95% CI 0.33 – 0.49) episodes per patient-year, respectively (incidence rate ratio [IRR] 1.01, 95% CI 0.75 – 1.35). When specific causative organisms were examined, no differences were observed between the groups for gram-positive (IRR 0.99, 95% CI 0.66 – 1.49), gram-negative (IRR 0.71, 95% CI 0.39 – 1.29), culture-negative (IRR 2.01, 95% CI 0.91 – 4.42), or polymicrobial peritonitis (IRR 1.08, 95% CI 0.36 – 3.20). Exit-site infection rates were 0.37 (95% CI 0.28 – 0.45) and 0.33 (95% CI 0.26 – 0.40) episodes per patient-year for the honey and control groups, respectively (IRR 1.12, 95% CI 0.81 – 1.53). No significant differences were observed between the groups for gram-positive (IRR 1.10, 95% CI 0.70 – 1.72), gram-negative (IRR 0.85, 95% CI 0.46 – 1.58), culture-negative (IRR 1.88, 95% CI 0.67 – 5.29), or polymicrobial ESI (IRR 1.00, 95% CI 0.40 – 2.54). Times to first peritonitis-associated and first infection-associated hospitalization were similar in the honey and control groups.
The rates of technique failure (PD withdrawal) due to PD-related infection were not significantly different between the groups. Compared with standard nasal mupirocin prophylaxis, daily topical exit-site application of antibacterial honey resulted in comparable rates of organism-specific peritonitis and ESI, infection-associated hospitalization, and infection-associated technique failure in PD patients.
Publisher: SAGE Publications
Date: 21-04-2021
Abstract: Peritoneal dialysis (PD) technique survival is an important outcome for patients, caregivers and health professionals; however, the definition and measures used for technique survival vary. We aimed to assess the scope and consistency of definitions and measures used for technique survival in studies of patients receiving PD. MEDLINE, EMBASE and CENTRAL databases were searched for randomised controlled trials (RCTs) conducted in patients receiving PD reporting technique survival as an outcome between database inception and December 2019. The definitions and measures used were extracted and independently assessed by two reviewers. We included 25 RCTs with a total of 3645 participants (41–371 per trial) and follow-up ranging from 6 weeks to 4 years. Terminology used included ‘technique survival’ (10 studies), ‘transfer to haemodialysis (HD)’ (8 studies) and ‘technique failure’ (7 studies), with 17 different definitions. In seven studies, it was unclear whether the definition included transfer to HD, death or transplantation, and eight studies reported ‘transfer to HD’ without further definition regarding duration or other events. Of those remaining, five studies included death in their definition of a technique event, whereas death was censored in the other five. The duration of HD necessary to qualify as an event was reported in only four (16%) studies. Of the 14 studies reporting causes of an event, all used a different list of causes. There is substantial heterogeneity in how PD technique survival is defined and measured, likely contributing to considerable variability in reported rates. Standardised measures for reporting technique survival in PD studies are required to improve comparability.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 11-2007
DOI: 10.2215/CJN.02550607
Publisher: SAGE Publications
Date: 10-06-2022
DOI: 10.1177/11297298221099134
Abstract: To describe and compare de novo arteriovenous fistula (AVF) failure rates between Australia and New Zealand (ANZ), and Malaysia. AVFs are preferred for haemodialysis access but are limited by high rates of early failure. A post hoc analysis of 353 participants from ANZ and Malaysia included in the FAVOURED randomised controlled trial undergoing de novo AVF surgery was performed. Composite AVF failure (thrombosis, abandonment, cannulation failure) and its individual components were compared between ANZ (n = 209) and Malaysian (n = 144) participants using logistic regression adjusted for patient-related and potentially modifiable clinical factors. Participants’ mean age was 55 ± 14.3 years and 64% were male. Compared with ANZ participants, Malaysian participants were younger, with lower body mass index, higher prevalence of diabetes mellitus and lower prevalence of cardiovascular disease. AVF failure was less frequent in the Malaysian cohort (38% vs 54%; adjusted odds ratio (OR) 0.53, 95% confidence interval (CI) 0.31–0.93). This difference was driven by lower odds of cannulation failure (29% vs 47%, OR 0.45, 95% CI 0.25–0.80), while the odds of AVF thrombosis (17% vs 20%, OR 1.24, 95% CI 0.62–2.48) and abandonment (25% vs 23%, OR 1.17, 95% CI 0.62–2.16) were similar. The risk of AVF failure was significantly lower in Malaysia compared with ANZ, driven by a lower risk of cannulation failure. Differences in practice patterns, including patient selection, surgical techniques, anaesthesia or cannulation techniques, may account for regional outcome differences and warrant further investigation.
Publisher: Wiley
Date: 28-01-2016
DOI: 10.1111/NEP.12573
Abstract: The Fish oils and Aspirin in Vascular access OUtcomes in REnal Disease (FAVOURED) trial investigated whether 3 months of omega-3 polyunsaturated fatty acids, either alone or in combination with aspirin, would effectively reduce primary access failure of de novo arteriovenous fistulae. This report presents the baseline characteristics of all study participants, examines whether study protocol amendments successfully increased recruitment of a broader and more representative haemodialysis cohort, including patients already receiving aspirin, and contrasts Malaysian participants with those from Australia, New Zealand and the United Kingdom (UK). This international, randomized, double-blind, placebo-controlled trial included patients older than 19 years with stage 4 or 5 chronic kidney disease currently receiving, or planned within 12 months to receive, haemodialysis. Participants (n = 568) were overweight (28.6 ± 7.3 kg/m2), relatively young (54.8 ± 14.3 years), and predominantly male (63%), with a high prevalence of diabetes mellitus (46%) but a low rate of ischaemic heart disease (8%). Sixty-one percent were planned for lower-arm arteriovenous fistula creation. Malaysian participants (n = 156) were younger (51.8 ± 13.6 years vs 57.1 ± 14.2 years, P < 0.001), with a higher prevalence of diabetes mellitus (65% vs 43%, P < 0.001) but less ischaemic heart disease (5% vs 14%, P < 0.01), compared with the combined Australian, New Zealand and UK cohort (n = 228). Protocol modifications allowing for the inclusion of patients receiving aspirin increased the prevalence of co-morbidities compared with the original cohort. The FAVOURED study participants, while mostly similar to patients in contemporary national registry reports and comparable recent clinical trials, were on average younger and had less ischaemic heart disease. These differences were reduced as a consequence of including patients already receiving aspirin.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 03-2011
DOI: 10.2215/CJN.06880810
Abstract: Vitamin D is an established important contributor to muscle function and aerobic metabolism. Hypovitaminosis D is highly prevalent in CKD patients and is associated with increased cardiovascular (CV) mortality via unknown mechanisms. Because aerobic-exercise capacity strongly predicts future CV events, we hypothesized that vitamin D status could be linked to CV outcomes via an effect on maximum aerobic-exercise capacity in patients with CKD, and that this effect may be mediated in part via its actions on muscle strength and functional ability. Baseline demographic, anthropometric, and biochemical data were collected in a cross-sectional study of patients with moderate CKD. Peak aerobic capacity was determined during treadmill stress testing using metabolic equivalents of tasks. Physical activity was assessed using the Active Australia questionnaire, grip strength by dynamometer, and functional capacity by “Up & Go” testing. The study included 85 participants (age 59.5 ± 9.7 years, 60% male, 44% diabetic, 92% Caucasian; mean serum 25-hydroxyvitamin D [25-OHD] 78.4 ± 29.4 nmol/L). We demonstrated that 25-OHD status was independently associated with aerobic-exercise capacity (β = 0.2; P = 0.008). Aerobic-exercise capacity was also predicted by younger age, white race, smaller waist circumference, absence of a previous angina history, and increasing weekly physical activity. However, neither muscle strength nor functional ability was significantly associated with 25-OHD. Vitamin D is independently associated with aerobic capacity in CKD patients, and this finding is not explained by changes in muscle strength or functional ability.
Publisher: Wiley
Date: 22-10-2021
DOI: 10.1111/NEP.13782
Publisher: BMJ
Date: 10-2020
DOI: 10.1136/BMJOPEN-2020-038005
Abstract: Presymptomatic testing is available for early diagnosis of hereditary autosomal dominant polycystic kidney disease (ADPKD). However, the complex ethical and psychosocial implications can make decision-making challenging and require an understanding of patients’ values, goals and priorities. This study aims to describe patient and caregiver beliefs and expectations regarding presymptomatic testing for ADPKD. 154 participants (120 patients and 34 caregivers) aged 18 years and over from eight centres in Australia, France and Korea participated in 17 focus groups. Transcripts were analysed thematically. We identified five themes: avoiding financial disadvantage (insecurity in the inability to obtain life insurance, limited work opportunities, financial burden); futility in uncertainty (erratic and diverse manifestations of disease limiting utility, taking preventive actions in vain, daunted by perplexity of results, unaware of risk of inheriting ADPKD); lacking autonomy and support in decisions (overwhelmed by ambiguous information, medicalising family planning, family pressures); seizing control of well-being (gaining confidence in early detection, allowing preparation for the future, reassurance in family resilience); and anticipating impact on quality of life (reassured by lack of symptoms, judging value of life with ADPKD). For patients with ADPKD, presymptomatic testing provides an opportunity to take ownership of their health through family planning and preventive measures. However, these decisions can be fraught with tensions and uncertainty about prognostic implications, and the psychosocial and financial burden of testing. Healthcare professionals should focus on genetic counselling, mental health and providing education to patients’ families to support informed decision-making. Policymakers should consider the cost burden and risk of discrimination when informing government policies. Finally, patients are recommended to focus on self-care from an early age.
Publisher: Wiley
Date: 08-01-2018
Publisher: Wiley
Date: 24-12-2014
DOI: 10.1002/PROS.22767
Abstract: In prostate cancer (PCa) patients, the protein target for androgen deprivation and blockade therapies is the androgen receptor (AR). AR interacts with many proteins that function to either co-activate or co-repress its activity. Caveolin-1 (Cav-1) is not found in normal prostatic epithelium, but is found in PCa, and may be an AR co-regulator protein. We investigated cell line-specific signatures and associations of endogenous AR and Cav-1 in six PCa cell lines of known androgen sensitivity: LNCaP (androgen sensitive), 22Rv1 (androgen responsive), PC3, DU145 and ALVA41 (androgen non-reliant), and RWPE1 (non-malignant). Protein and mRNA expression profiles were compared, and electron microscopy was used to identify cells with caveolar structures. For cell lines expressing both AR and Cav-1, knockdown techniques using small interfering RNA against AR or Cav-1 were used to test whether diminished expression of one affected the other. Co-sedimentation of AR and Cav-1 was used to test their association. A reporter assay for AR genomic activity was utilized following Cav-1 knockdown. AR-expressing LNCaP and 22Rv1 cells had low endogenous Cav-1 mRNA and protein. Cell lines that expressed little or no AR (DU145, PC3, ALVA41, and RWPE1) expressed high endogenous levels of Cav-1. AR knockdown in LNCaP cells had little effect on Cav-1, but Cav-1 knockdown inhibited AR expression and genomic activity. These data show that endogenous AR and Cav-1 mRNA and protein expression is inversely related in PCa cells, with Cav-1 acting on the androgen/AR signaling axis, possibly as an AR co-activator, as demonstrated by diminished AR genomic activity following Cav-1 knockdown.
Publisher: Elsevier BV
Date: 05-2016
DOI: 10.1016/J.TOXLET.2016.02.016
Abstract: Enzymes of the cytochrome P450 (CYP) superfamily are implicated in cadmium (Cd)-induced nephrotoxicity; however, direct evidence is lacking. This study investigated the endogenous expression of various CYP proteins, together with the stress-response proteins heme oxygenase-1 (HO-1) and metallothionein (MT), in human kidney sections and in cadmium-exposed primary cultures of human proximal tubular epithelial cells (PTC). By immunohistochemistry, the CYP members 2B6, 4A11 and 4F2 were prominently expressed in the cortical proximal tubular cells and to a lesser extent in distal tubular cells. Low levels of CYPs 2E1 and 3A4 were also detected. In PTC, in the absence of Cd, CYP2E1, CYP3A4, CYP4F2 and MT were expressed, but HO-1, CYP2B6 and CYP4A11 were not detected. A range of cadmium concentrations (0-100 μM) was used to induce stress conditions. MT protein was further induced by as little as 0.5 μM cadmium, reaching a 6-fold induction at 20 μM, whereas for HO-1, a 5 μM cadmium concentration was required for initial induction, reaching a 15-fold induction at 20 μM cadmium. The expression of CYP2E1, CYP3A4, and CYP4F2 was not altered by any cadmium concentration tested at 48 h. Cadmium caused a reduction in cell viability at concentrations above 10 μM. In conclusion, although cultured PTC do express CYP proteins (CYP2E1, CYP3A4, and CYP4F2), Cd-induced cell stress, as indicated by induction of HO-1 and MT, does not alter expression of these CYP proteins at 48 h.
Publisher: Wiley
Date: 31-05-2014
Publisher: SAGE Publications
Date: 07-2018
Publisher: Oxford University Press (OUP)
Date: 05-2000
DOI: 10.1093/NDT/15.5.743
Publisher: Springer Science and Business Media LLC
Date: 15-06-2012
Abstract: The aim of the study was to determine whether distance between residence and peritoneal dialysis (PD) unit influenced peritonitis occurrence, microbiology, treatment and outcomes. The study included all patients receiving PD between 1/10/2003 and 31/12/2008, using ANZDATA Registry data. 365 (6%) patients lived ≥100 km from their nearest PD unit (distant group), while 6183 (94%) lived <100 km away (local group). Median time to first peritonitis in distant patients (1.34 years, 95% CI 1.07-1.61) was significantly shorter than in local patients (1.68 years, 95% CI 1.59-1.77, p = 0.001), whilst overall peritonitis rates were higher in distant patients (incidence rate ratio 1.32, 95% CI 1.20-1.46). Living ≥100 km away from a PD unit was independently associated with a higher risk of S. aureus peritonitis (adjusted odds ratio [OR] 1.64, 95% CI 1.09-2.47). Distant patients with first peritonitis episodes were less likely to be hospitalised (64% vs 73%, p = 0.008) and to receive antifungal prophylaxis (4% vs 10%, p = 0.01), but more likely to receive vancomycin-based antibiotic regimens (52% vs 42%, p < 0.001). Using multivariable logistic regression analysis of peritonitis outcomes, distant patients were more likely to be cured with antibiotics alone (OR 1.55, 95% CI 1.03-2.24). All other outcomes were comparable between the two groups. Living ≥100 km away from a PD unit was associated with an increased risk of S. aureus peritonitis, modified approaches to peritonitis treatment, and peritonitis outcomes that were comparable to, or better than, those of patients living closer to a PD unit. Staphylococcal decolonisation should receive particular consideration in remote-living patients.
Publisher: Frontiers Media SA
Date: 05-09-2012
DOI: 10.1111/J.1432-2277.2012.01553.X
Abstract: This study analysed associations between tacrolimus, mycophenolic acid (MPA) and prednisolone exposures on day 4 and month 1 post kidney transplant and clinical outcomes. Area under the concentration-time curve (AUC) for each drug was estimated using validated multiple regression-derived limited sampling strategies. Multivariate logistic regression was used to associate drug exposure with clinical outcomes. One hundred and twenty subjects were studied. Between-subject variability in dose-adjusted exposure to each medication was high. Both day 4 tacrolimus and MPA exposures were independently predictive of delayed graft function (2.6 change in odds for a standard deviation (SD) increase in tacrolimus AUC(0-12), P = 0.02; 0.23 change in odds for a SD increase in MPA AUC(0-12), P = 0.02). Both day 4 MPA and total prednisolone exposures were independently predictive of rejection (0.20 change in odds for a SD increase in MPA AUC(0-12), P = 0.04; 0.40 change in odds for a SD increase in total prednisolone AUC(0-6), P = 0.03). Lowest-tertile exposure to all three immunosuppressant medications imposed significantly higher odds of rejection [adjusted odds ratio 34.2 (95% CI 4.1, 284.4), P = 0.001]. This study highlights the importance of achieving early target exposure and suggests a potential role for individualized initial dosing or early therapeutic monitoring of all three immunosuppressive agents.
Publisher: SAGE Publications
Date: 12-2001
DOI: 10.1177/089686080102103S41
Abstract: The vast majority of erythropoietin (EPO)–treated peritoneal dialysis (PD) patients require iron supplementation. Most authors and clinical practice guidelines recommend primary oral iron supplementation in PD patients because it is more practical and less expensive. However, numerous studies have clearly demonstrated that oral iron therapy is unable to maintain EPO-treated PD patients in positive iron balance. Once patients become iron-deficient, intravenous iron administration has been found to more effectively augment iron stores and hematologic response than does oral therapy. We recently performed a prospective, cross-over trial in 28 iron-replete PD patients and showed that twice-monthly outpatient iron polymaltose infusions (200 mg) were a practical and safe alternative to oral iron. That treatment produced significant increases in hemoglobin concentration and body iron stores. The additional expense of intravenous iron therapy was completely offset by reductions in EPO dosage. Careful monitoring of iron stores is important in patients receiving intravenous iron supplementation in view of epidemiologic links with infection and cardiovascular disease. Nevertheless, a growing body of evidence suggests that, as has been found for hemodialysis patients, intravenous iron therapy is superior to oral iron supplementation in EPO-treated PD patients.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 14-08-2023
DOI: 10.2215/CJN.0000000000000280
Abstract: Peritoneal dialysis represents an important treatment choice for patients with kidney failure. It allows them to dialyze outside the hospital setting, facilitating enhanced opportunities to participate in life-related activities, flexibility in schedules, time and cost savings from reduced travel to dialysis centers, and improved quality of life. Despite its numerous advantages, peritoneal dialysis utilization has been static or diminishing in parts of the world. Peritoneal dialysis-related infections, such as peritonitis, exit-site infection or tunnel infection, are a major concern for patients, caregivers, and health professionals, and may result in hesitation to consider this treatment or in cessation of therapy when these complications occur. In this review, the definition, epidemiology, risk factors, prevention, and treatment of peritoneal dialysis-related infections are described based on contemporary evidence.
Publisher: Elsevier BV
Date: 02-2013
DOI: 10.1053/J.AJKD.2012.09.008
Abstract: Abnormalities of cardiac structure and function are common in patients undergoing dialysis, and cardiovascular disease is the major cause of mortality in this group. Heart failure is a common clinical manifestation of cardiovascular disease and is preceded by left ventricular hypertrophy (LVH). There are variable reports about the impact of dialysis on LVH, both deleterious and beneficial. Our study investigated whether the timing of the initiation of dialysis therapy had an impact on cardiac structure and function. Randomized controlled trial. This is a cardiac substudy involving 182 patients with stage 5 chronic kidney disease in the IDEAL (Initiating Dialysis Early and Late) trial. The IDEAL trial randomly assigned patients, on the basis of estimated glomerular filtration rate (eGFR) calculated using the Cockcroft-Gault equation, to start dialysis therapy early (GFR, 10-14 mL/min/1.73 m(2)), with the others starting late (GFR, 5-7 mL/min/1.73 m(2)). Echocardiograms were obtained at baseline and 12 months after randomization. Primary outcomes were change in left ventricular mass indexed for height (LVMi) between baseline and 12 months, left ventricular ejection fraction, left ventricular systolic annular velocity, ratio of mitral inflow velocity (E) to mitral annular velocity (Ea) (E/Ea), and left atrial volume indexed for height (LAVi). LVMi at baseline was elevated, but similar in both groups, with no significant change within or between groups at 12 months. E/Ea and LAVi were increased at baseline, consistent with significant diastolic dysfunction; there were no differences between groups at 12 months, and no changes were observed for left ventricular volumes, left ventricular ejection fraction, stroke volume, and other echocardiographic parameters. Small multicenter study using echocardiography. 
Advanced cardiac disease in these patients with stage 5 chronic kidney disease did not progress during the 12-month study period and planned early initiation of dialysis therapy did not result in differences in any echocardiographic variables of cardiac structure and function.
Publisher: Elsevier BV
Date: 02-2019
DOI: 10.1016/J.KINT.2018.11.008
Abstract: In November 2017, the Kidney Disease: Improving Global Outcomes (KDIGO) initiative brought a diverse panel of experts in glomerular diseases together to discuss the 2012 KDIGO glomerulonephritis guideline in the context of new developments and insights that had occurred over the years since its publication. During this KDIGO Controversies Conference on Glomerular Diseases, the group examined data on disease pathogenesis, biomarkers, and treatments to identify areas of consensus and areas of controversy. This report summarizes the discussions on primary podocytopathies, lupus nephritis, anti-neutrophil cytoplasmic antibody-associated nephritis, complement-mediated kidney diseases, and monoclonal gammopathies of renal significance.
Publisher: Oxford University Press (OUP)
Date: 09-08-2020
DOI: 10.1093/NDT/GFZ160
Abstract: Withdrawal from dialysis is an increasingly common cause of death in patients with end-stage kidney disease (ESKD). As most published reports of dialysis withdrawal have been outside the Oceania region, the aims of this study were to determine the frequency, temporal pattern and predictors of dialysis withdrawal in Australian and New Zealand patients receiving chronic haemodialysis. This study included all people with ESKD in Australia and New Zealand who commenced chronic haemodialysis between 1 January 1997 and 31 December 2016, using data from the Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry. Competing risk regression models were used to identify predictors of dialysis withdrawal mortality, using non-withdrawal cause of death as the competing risk event. Among 40 447 people receiving chronic haemodialysis (median age 62 years, 61% male, 9% Indigenous), dialysis withdrawal mortality rates increased from 1.02 per 100 patient-years (11% of all deaths) during the period 1997–2000 to 2.20 per 100 patient-years (32% of all deaths) during 2013–16 (P < 0.001). Variables that were significantly associated with a higher likelihood of haemodialysis withdrawal were older age {≥70 years: subdistribution hazard ratio [SHR] 1.77 [95% confidence interval (CI) 1.66–1.89]; reference 60–70 years}, female sex [SHR 1.14 (95% CI 1.09–1.21)], white race [Asian: SHR 0.56 (95% CI 0.49–0.65); Aboriginal and Torres Strait Islander: SHR 0.83 (95% CI 0.74–0.93); Pacific Islander: SHR 0.47 (95% CI 0.39–0.68); reference: white race], coronary artery disease [SHR 1.18 (95% CI 1.11–1.25)], cerebrovascular disease [SHR 1.15 (95% CI 1.08–1.23)], chronic lung disease [SHR 1.13 (95% CI 1.06–1.21)] and more recent era [2013–16: SHR 3.96 (95% CI 3.56–4.48); reference: 1997–2000]. Death due to haemodialysis withdrawal has become increasingly common in Australia and New Zealand over time. 
Predictors of haemodialysis withdrawal include older age, female sex, white race and haemodialysis commencement in a more recent era.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 20-12-2019
DOI: 10.2215/CJN.05380518
Abstract: The absence of accepted patient-centered outcomes in research can limit shared decision-making in peritoneal dialysis (PD), particularly because PD-related treatments can be associated with mortality, technique failure, and complications that can impair quality of life. We aimed to identify patient and caregiver priorities for outcomes in PD, and to describe the reasons for their choices. Patients on PD and their caregivers were purposively sampled from nine dialysis units across Australia, the United States, and Hong Kong. Using nominal group technique, participants identified and ranked outcomes, and discussed the reasons for their choices. An importance score (scale 0–1) was calculated for each outcome. Qualitative data were analyzed thematically. Across 14 groups, 126 participants (81 patients, 45 caregivers), aged 18–84 (mean 54, SD 15) years, identified 56 outcomes. The ten highest ranked outcomes were PD infection (importance score, 0.27), mortality (0.25), fatigue (0.25), flexibility with time (0.18), BP (0.17), PD failure (0.16), ability to travel (0.15), sleep (0.14), ability to work (0.14), and effect on family (0.12). Mortality was ranked first in Australia, second in Hong Kong, and 15th in the United States. The five themes were serious and cascading consequences on health, current and impending relevance, maintaining role and social functioning, requiring constant vigilance, and beyond control and responsibility. For patients on PD and their caregivers, PD-related infection, mortality, and fatigue were of highest priority, and were focused on health, maintaining lifestyle, and self-management. Reporting these patient-centered outcomes may enhance the relevance of research to inform shared decision-making.
Publisher: Elsevier BV
Date: 02-2012
Publisher: Elsevier BV
Date: 10-2010
DOI: 10.1053/J.AJKD.2010.06.016
Abstract: Sexual dysfunction is an under-recognized problem in men and women with chronic kidney disease (CKD). The prevalence, correlates, and predictors of this condition in patients with CKD have not been evaluated comprehensively. Systematic review and meta-analysis. Patients treated using dialysis (dialysis patients), patients treated using transplant (transplant recipients), and patients with CKD not treated using dialysis or transplant (nondialysis nontransplant patients with CKD). Observational studies conducted in patients with CKD only or including a control group without CKD. Type of study population. Sexual dysfunction in men and women with CKD using validated tools, such as the International Index of Erectile Function, the Female Sexual Function Index (FSFI), or other measures as reported by study investigators. 50 studies (8,343 patients) of variable size (range, 16-1,023 patients) were included in this review. Almost all studies explored sexual dysfunction in men, and specifically erectile dysfunction. The summary estimate of erectile dysfunction in men with CKD was 70% (95% CI, 62%-77%; 21 studies, 4,389 patients). Differences in reported prevalence rates of erectile dysfunction between different studies were attributable primarily to age, study populations, and type of study tool used to assess the presence of erectile dysfunction. In women, the reported prevalence of sexual dysfunction was assessed in only 306 patients from 2 studies and ranged from 30%-80%. Compared with the general population, women with CKD had a significantly lower overall FSFI score (8 studies or subgroups, 407 patients; mean difference, -9.28; 95% CI, -12.92 to -5.64). Increasing age, diabetes mellitus, and depression consistently were found to correlate with sexual dysfunction in 20 individual studies of patients with CKD using different methods. Suboptimal and nonuniform assessment of outcome measures. 
Sexual dysfunction is highly prevalent in both men and women with CKD, especially among those on dialysis. Larger studies enrolling different ethnic groups, using validated study tools, and analyzing the influence of various factors on the development of sexual dysfunction are needed.
Publisher: Public Library of Science (PLoS)
Date: 26-06-2020
Publisher: SAGE Publications
Date: 07-2010
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 19-02-2020
DOI: 10.2215/CJN.07070619
Abstract: Survival in pediatric kidney transplant recipients has improved over the past five decades, but changes in cause-specific mortality remain uncertain. The aim of this retrospective cohort study was to estimate the associations between transplant era and overall and cause-specific mortality for child and adolescent recipients of kidney transplants. Data were obtained on all children and adolescents (aged <20 years) who received their first kidney transplant from 1970 to 2015 from the Australian and New Zealand Dialysis and Transplant Registry. Mortality rates were compared across eras using Cox regression, adjusted for confounders. A total of 1810 recipients (median age at transplantation 14 years, 58% male, 52% living donor) were followed for a median of 13.4 years. Of these, 431 (24%) died, 174 (40%) from cardiovascular causes, 74 (17%) from infection, 50 (12%) from cancer, and 133 (31%) from other causes. Survival rates improved over time, with 5-year survival rising from 85% for those first transplanted in 1970–1985 (95% confidence interval [95% CI], 81% to 88%) to 99% in 2005–2015 (95% CI, 98% to 100%). This was primarily because of reductions in deaths from cardiovascular causes (adjusted hazard ratio [aHR], 0.25; 95% CI, 0.08 to 0.68) and infections (aHR, 0.16; 95% CI, 0.04 to 0.70; both for 2005–2015 compared with 1970–1985). Compared with patients transplanted 1970–1985, mortality risk was 72% lower among those transplanted 2005–2015 (aHR, 0.28; 95% CI, 0.18 to 0.69), after adjusting for potential confounders. Survival after pediatric kidney transplantation has improved considerably over the past four decades, predominantly because of marked reductions in cardiovascular- and infection-related deaths.
Publisher: Elsevier BV
Date: 04-2010
DOI: 10.1053/J.AJKD.2009.11.015
Abstract: Reports of culture-negative peritoneal dialysis (PD)-associated peritonitis have been sparse, conflicting, and limited to small single-center studies. The aim of this investigation is to examine the frequency, predictors, treatment, and outcomes of culture-negative PD-associated peritonitis. Observational cohort study using Australia and New Zealand Dialysis and Transplant Registry (ANZDATA) data. All Australian PD patients between October 1, 2003, and December 31, 2006. Demographic, clinical, and facility variables. Culture-negative PD-associated peritonitis occurrence, relapse, hospitalization, catheter removal, hemodialysis transfer, and death. Of 4,675 patients who received PD in Australia during the study period, 435 episodes of culture-negative peritonitis occurred in 361 individuals. Culture-negative peritonitis was not associated with demographic or clinical variables. A history of previous antibiotic treatment for peritonitis was more common with culture-negative than culture-positive peritonitis (42% vs 35%; P = 0.01). Compared with culture-positive peritonitis, culture-negative peritonitis was significantly more likely to be cured using antibiotics alone (77% vs 66%; P < 0.001) and less likely to be complicated by hospitalization (60% vs 71%; P < 0.001), catheter removal (12% vs 23%; P < 0.001), permanent hemodialysis therapy transfer (10% vs 19%; P < 0.001), or death (1% vs 2.5%; P = 0.04). Relapse rates were similar between the 2 groups. Patients with relapsed culture-negative peritonitis were more likely to have their catheters removed (29% vs 10% [P < 0.001]; OR, 3.83; 95% CI, 2.00-7.32). Administration of vancomycin or cephalosporin in the initial empiric antibiotic regimen and the timing of catheter removal were not significantly associated with clinical outcomes. Limited covariate adjustment. Residual confounding and coding bias could not be excluded. Culture-negative peritonitis is a common complication with a relatively benign outcome. 
A history of previous antibiotic treatment is a significant risk factor for this condition.
Publisher: Informa UK Limited
Date: 10-2011
Publisher: Oxford University Press (OUP)
Date: 22-04-2010
DOI: 10.1093/NDT/GFQ222
Abstract: Coagulase-negative staphylococcal (CNS) peritonitis is the most common cause of peritoneal dialysis (PD)-associated peritonitis. Previous reports of this important condition have been sparse and generally limited to single-centre studies. The frequency, predictors, treatment and clinical outcomes of CNS peritonitis were examined by multivariate logistic regression and multilevel Poisson regression in all adult PD patients in Australia between 2003 and 2006. A total of 936 episodes of CNS peritonitis (constituting 26% of all peritonitis episodes) occurred in 620 individuals. The observed rate of CNS peritonitis was 0.16 episodes per patient-year. Lower rates of CNS peritonitis were independently predicted by Asian racial origin (adjusted odds ratio [OR], 0.52; 95% CI, 0.35-0.79), renovascular nephrosclerosis (OR, 0.40; 95% CI, 0.18-0.86), early referral to a renal unit prior to dialysis commencement (OR, 0.38; 95% CI, 0.19-0.79) and treatment with automated PD at any time during the PD career (OR, 0.79; 95% CI, 0.66-0.96). The majority of CNS peritonitis episodes were initially treated with intraperitoneal vancomycin or cephazolin in combination with gentamicin. This regimen was changed in 533 (57%) individuals after a median period of 3 days, most commonly to vancomycin monotherapy. The median total antibiotic course duration was 14 days. Compared with other forms of peritonitis, CNS episodes were significantly more likely to be cured by antibiotics alone (76 vs 64%, P < 0.001) and less likely to be complicated by hospitalization (61 vs 73%, P < 0.001), catheter removal (10 vs 26%, P < 0.001), temporary haemodialysis (2 vs 5%, P < 0.001), permanent haemodialysis transfer (9 vs 21%, P < 0.001) and death (1.0 vs 2.7%, P = 0.002). CNS peritonitis was also associated with a shorter duration of hospitalization, a longer time to catheter removal and a shorter duration of temporary haemodialysis. 
Catheter removal and permanent haemodialysis transfer were independently predicted by polymicrobial peritonitis and initial empiric administration of vancomycin (compared with cephalosporins). CNS peritonitis was associated with a higher relapse rate (17 vs 13%, P = 0.003), and relapsed CNS peritonitis was associated with a higher catheter removal rate (22 vs 7%, P < 0.001). CNS peritonitis is a common complication with a relatively benign outcome compared with other forms of PD-associated peritonitis. Relapsed and repeat peritonitis are relatively common and are associated with worse outcomes.
Publisher: Elsevier BV
Date: 09-2009
DOI: 10.1038/KI.2009.202
Abstract: Fungal peritonitis is a serious complication of peritoneal dialysis, but previous reports on this condition have been limited to small, single-center studies. Using data on all Australian peritoneal dialysis patients, we measured predictors, treatments, and outcomes of this condition by logistic regression and multilevel, multivariate Poisson regression. This encompassed 66 centers over a 4-year period and included 162 episodes of fungal peritonitis (4.5% of all peritonitis episodes) occurring in 158 individuals. Candida albicans (25%) and other Candida species (44%) were the most common fungi isolated. Fungal peritonitis was independently predicted by indigenous race and prior treatment of bacterial peritonitis. For peritonitis episodes occurring 7 and 60 days after treatment for previous bacterial peritonitis, the probability of fungal peritonitis decreased to 23% and 6%, respectively. Compared with other organisms, fungal peritonitis was associated with significantly higher rates of hospitalization, catheter removal, transfer to permanent hemodialysis, and death. The risks of repeat fungal peritonitis and death were lowest with catheter removal combined with antifungal therapy when compared to either intervention alone. Our study shows that fungal peritonitis is a serious complication of peritoneal dialysis and should be strongly suspected in the context of recent antibiotic treatment for bacterial peritonitis.
Publisher: Wiley
Date: 18-03-2021
DOI: 10.1111/NEP.13873
Abstract: With improved life expectancy over time, the burden of kidney failure resulting in kidney replacement therapy (KRT) in older persons is increasing. This study aimed to describe the age distribution at dialysis initiation in Australia and New Zealand (ANZ) across centres and over time. Adults initiating dialysis as first KRT in ANZ from 1999 to 2018 reported to the Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry were included. The primary outcomes were the age distribution and the proportion of older persons (75 years and older) initiating dialysis across centres and over time. Secondary outcomes were characterization of the older population compared with younger people and differences in dialysis modality and treatment trajectories between groups. Over the study period, 55 382 people initiated dialysis as first KRT, including 10 306 older persons, in 100 centres. Wide variation in age distribution across states/countries was noted, although the proportion of older persons at dialysis initiation did not significantly change over time (from 13% in 1999 to 19% in 2003, then remaining stable thereafter). Older persons were less likely to be treated with home therapies compared with younger people. Older persons were mostly Caucasian, had higher socioeconomic position, more cardiovascular comorbidities and higher eGFR at baseline, and resided in major cities. Higher proportions of older persons per centre were noted in privately funded facilities. Wide variations were noted in the proportions of older persons initiating dialysis across centres and states/country, which were associated with different case‐mix across regions, particularly in terms of ethnicity, remoteness and socioeconomic advantage.
Publisher: Elsevier BV
Date: 11-2013
DOI: 10.1038/KI.2013.190
Abstract: Neutral-pH peritoneal dialysates, with reduced glucose degradation products (GDPs), have been developed to reduce peritoneal membrane damage. Here our review evaluated the impact of these solutions on clinical outcomes using data from The Cochrane CENTRAL Registry, MEDLINE, Embase, and reference lists for randomized trials of biocompatible solutions. Summary estimates of effect were obtained using a random-effects model of 20 eligible trials encompassing 1383 patients. The quality of studies was generally poor, such that 13 studies had greater than a 20% loss to follow-up and only 3 trials reported adequate concealment of allocation. Use of neutral-pH dialysates with reduced GDPs resulted in larger urine volumes (7 trials, 520 patients; mean difference 126 ml/day, 95% CI 27-226), improved residual renal function after 12 months (6 trials, 360 patients; standardized mean difference 0.31, 95% confidence interval 0.10-0.52), and a trend to reduced inflow pain (1 trial, 58 patients; relative risk 0.51, 95% CI 0.24-1.08). However, there was no significant effect on body weight, hospitalization, peritoneal solute transport rate, peritoneal small-solute clearance, peritonitis, technique failure, patient survival, or adverse events. No significant harms were identified. Thus, based on generally poor quality trials, the use of neutral-pH peritoneal dialysates with reduced GDPs resulted in greater urine volumes and residual renal function after 12 months, but without other clinical benefits. Larger, better-quality studies are needed for accurate evaluation of the impact of these newer dialysates on patient-level hard outcomes.
Publisher: Elsevier BV
Date: 11-2019
Publisher: BMJ
Date: 2019
DOI: 10.1136/BMJOPEN-2018-024551
Abstract: To evaluate the feasibility and acceptability of a personalised telehealth intervention to support dietary self-management in adults with stage 3–4 chronic kidney disease (CKD). Mixed-methods process evaluation embedded in a randomised controlled trial. People with stage 3–4 CKD (estimated glomerular filtration rate [eGFR] 15–60 mL/min/1.73 m2). Participants were recruited from three hospitals in Australia and completed the intervention in ambulatory community settings. The intervention group received one telephone call per fortnight and 2–8 tailored text messages for 3 months, and then 4–12 tailored text messages for 3 months without telephone calls. The control group received usual care for 3 months then non-tailored education-only text messages for 3 months. Feasibility (recruitment, non-participation and retention rates, intervention fidelity and participant adherence) and acceptability (questionnaire and semistructured interviews). Descriptive statistics and qualitative content analysis. Overall, 80/230 (35%) eligible patients who were approached consented to participate (mean±SD age 61.5±12.6 years). Retention was 93% and 98% in the intervention and control groups, respectively, and 96% of all planned intervention calls were completed. All participants in the intervention arm identified the tailored text messages as useful in supporting dietary self-management. In the control group, 27 (69%) reported the non-tailored text messages were useful in supporting change. Intervention group participants reported that the telehealth programme delivery methods were practical and able to be integrated into their lifestyle. Participants viewed the intervention as an acceptable, personalised alternative to face-to-face clinic consultations, and were satisfied with the frequency of contact. This telehealth-delivered dietary coaching programme is an acceptable intervention which appears feasible for supporting dietary self-management in stage 3–4 CKD. 
A larger-scale randomised controlled trial is needed to evaluate the efficacy of the coaching programme on clinical and patient-reported outcomes. Trial registration: ACTRN12616001212448.
Publisher: Elsevier BV
Date: 11-2020
Publisher: Wiley
Date: 12-2010
DOI: 10.1111/J.1440-1797.2010.01428.X
Abstract: Randomized controlled trials are the ideal study design to evaluate the effectiveness of health-care interventions. The conduct of a clinical trial is a collaborative effort between participants, investigators and a range of health-care professionals involved both centrally and locally in the coordination and execution of the trial. In this article, the key steps that are required to design a randomized controlled trial are summarized.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 06-2012
Publisher: SAGE Publications
Date: 11-2001
DOI: 10.1177/089686080002000630
Abstract: To investigate the degree and the determinants of peritoneal homocysteine (Hcy) clearance and to compare measured Hcy clearance with the Hcy clearance predicted based on molecular weight (MW). Cross-sectional observational analysis. Tertiary care institutional dialysis center. Sixty-five stable peritoneal dialysis (PD) patients. Fasting blood and 24-hour pooled dialysate effluents were collected for determination of peritoneal clearances of Hcy (CpHcy), urea (CpUr), and creatinine (CpCr). The dialysate-to-plasma creatinine ratio at 4 hours (D/P Cr 4 h) and levels of red cell folate, B12, ferritin, and C-reactive protein (CRP) were measured concurrently. Observed CpHcy was compared with predicted clearance, based on Hcy plasma protein binding and the relative molecular weights of Hcy, urea, and creatinine. Plasma concentrations of Hcy averaged 24.6 ± 1.1 μmol/L and were elevated above the upper limit of normal in 59 (91%) patients. The mean dialysate concentration of Hcy was 2.9 ± 0.3 μmol/L, equating to a daily peritoneal elimination of 34.6 ± 3.6 μmol. Observed CpHcy was closely approximated by predicted CpHcy (8.7 ± 0.6 L/week/1.73 m2 vs 9.0 ± 0.3 L/week/1.73 m2, respectively; p = 0.55). Patients maintained on automated PD (n = 5) had a CpHcy similar to that of patients treated with continuous ambulatory peritoneal dialysis (8.9 ± 1.0 L/week/1.73 m2 vs 8.7 ± 0.6 L/week/1.73 m2, p = 0.92). The CpHcy was significantly correlated with C-reactive protein (CRP), D/P creatinine, CpUr, CpCr, and peritoneal protein loss, but not with plasma Hcy, albumin, B12, ferritin, age, dialysis duration, peritonitis episodes, or daily dialysate effluent volume. By multivariate analysis, the only variables that remained significant independent predictors of CpHcy were CRP and D/P Cr 4 h. 
High and high-average transporters had a higher CpHcy than low and low-average transporters (9.7 ± 0.8 L/week/1.73 m2 vs 7.0 ± 0.7 L/week/1.73 m2, p < 0.05), despite comparably elevated plasma Hcy concentrations [25.2 ± 1.5 μmol/L vs 23.4 ± 1.6 μmol/L, p = nonsignificant (NS)]. Elevated plasma concentrations of Hcy are not efficiently reduced by PD. The relatively low peritoneal clearance of Hcy is largely accounted for by a high degree of plasma protein binding and is significantly influenced by peritoneal membrane permeability.
Publisher: Wiley
Date: 31-01-2003
DOI: 10.1046/J.1440-1797.2003.00119.X
Abstract: Culture-negative peritoneal inflammation accounts for between 5 and 20% of cases of peritonitis in peritoneal dialysis patients. Diagnostic yields may be enhanced considerably by reculturing dialysate effluents using appropriate collection methods and optimal laboratory techniques (including prolonged low-temperature and anaerobic incubations). In patients with persistent culture-negative peritonitis, consideration should be given to the possibilities of unusual or fastidious microorganisms (especially fungi and mycobacteria) and non-infective causes (especially drug reactions, malignancy, visceral inflammation and retroperitoneal inflammation). In this paper, an illustrative case of persistent culture-negative peritonitis is presented followed by a discussion of the investigative approach to such patients, with particular emphasis on differential diagnosis and the limitations of currently available tests.
Publisher: Oxford University Press (OUP)
Date: 28-02-2019
DOI: 10.1093/NDT/GFZ034
Abstract: Risk of encapsulating peritoneal sclerosis (EPS) is strongly associated with the duration of peritoneal dialysis (PD), such that patients who have been on PD for some time may consider elective transfer to haemodialysis to mitigate the risk of EPS. There is a need to determine this risk to better inform clinical decision making, but previous studies have not allowed for the competing risk of death. This study included new adult PD patients in Australia and New Zealand (ANZ; 1990–2010) or Scotland (2000–08) followed until 2012. Age, time on PD, primary renal disease, gender, data set and diabetic status were evaluated as predictors at the start of PD, then at 3 and 5 years after starting PD, using flexible parametric competing risks models. In 17 396 patients (16 162 ANZ, 1234 Scotland), EPS was observed in 99 (0.57%) patients, less frequently in ANZ patients (n = 65; 0.4%) than in Scottish patients (n = 34; 2.8%). The estimated risk of EPS was much lower when the competing risk of death was taken into account (1 − Kaplan–Meier = 0.0126; cumulative incidence function = 0.0054). Strong predictors of EPS included age, primary renal disease and time on PD. The risk of EPS was reasonably discriminated at the start of PD (C-statistic = 0.74–0.79) and this improved at 3 and 5 years after starting PD (C-statistic = 0.81–0.92). EPS risk estimates are lower when calculated using competing risk of death analyses. A patient's estimated risk of EPS is country-specific and can be predicted using age, primary renal disease and duration of PD.
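The gap between 1 − Kaplan–Meier and the cumulative incidence function reported in this abstract can be illustrated with a toy calculation. The sketch below uses made-up event data (not the registry data) and a bare-bones Aalen–Johansen estimate: treating deaths as censoring inflates the apparent EPS risk, because the probability of dying before EPS is implicitly redistributed to EPS.

```python
# Hypothetical illustration: 8 subjects with distinct event times.
# Event codes: 0 = censored, 1 = event of interest (e.g. EPS),
# 2 = competing event (death before EPS).
data = [(1, 2), (2, 1), (3, 2), (4, 0), (5, 1), (6, 2), (7, 0), (8, 1)]

def naive_one_minus_km(data, event=1):
    """1 - Kaplan-Meier for `event`, treating competing events as censoring."""
    n = len(data)
    surv = 1.0
    for t, e in sorted(data):
        if e == event:
            surv *= 1 - 1 / n  # event-free survival drops
        n -= 1                 # subject leaves the risk set either way
    return 1 - surv

def cumulative_incidence(data, event=1):
    """Aalen-Johansen cumulative incidence: competing events shrink the
    risk set AND the overall event-free survival, so their probability
    mass is not reassigned to `event`."""
    n = len(data)
    overall_surv = 1.0  # probability of being free of ANY event
    cif = 0.0
    for t, e in sorted(data):
        if e != 0:  # any event (of interest or competing)
            if e == event:
                cif += overall_surv * (1 / n)
            overall_surv *= 1 - 1 / n
        n -= 1
    return cif

print(naive_one_minus_km(data))   # overestimates the risk
print(cumulative_incidence(data)) # accounts for the competing risk
```

With these made-up data the naive estimate reaches 1.0 (the last subject at risk has the event), while the cumulative incidence stays well below it, mirroring the direction of the 0.0126 vs 0.0054 contrast in the study.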
Publisher: Wiley
Date: 04-2002
Publisher: Springer Science and Business Media LLC
Date: 02-04-2019
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 11-2003
DOI: 10.1097/01.ASN.0000091587.55159.5F
Abstract: Although obesity is associated with increased risks of morbidity and death in the general population, a number of studies of patients undergoing hemodialysis have demonstrated that increasing body mass index (BMI) is correlated with decreased mortality risk. Whether this association holds true among patients treated with peritoneal dialysis (PD) has been less well studied. The aim of this investigation was to examine the association between BMI and outcomes among new PD patients in a large cohort, with long-term follow-up monitoring. Using data from the Australia and New Zealand Dialysis and Transplant Registry, an analysis of all new adult patients (n = 9679) who underwent an episode of PD treatment in Australia or New Zealand between April 1, 1991, and March 31, 2002, was performed. Patients were classified as obese (BMI ≥ 30 kg/m²), overweight (BMI 25.0 to 29.9 kg/m²), normal weight (BMI 20 to 24.9 kg/m²), or underweight (BMI < 20 kg/m²). In multivariate analyses, obesity was independently associated with death during PD treatment (hazard ratio, 1.36; 95% confidence interval, 1.14 to 1.54; P < 0.05) and technique failure (hazard ratio, 1.17; 95% confidence interval, 1.07 to 1.26; P < 0.01), except among patients of New Zealand Maori/Pacific Islander origin, for whom there was no significant relationship between BMI and death during PD treatment. A supplementary fractional polynomial analysis modeled BMI as a continuous predictor and indicated a J-shaped relationship between BMI and patient mortality rates, and a steady increase in death-censored technique failure rates up to a BMI of 40 kg/m²; the mortality risk was lowest for BMI values of approximately 20 kg/m². In conclusion, obesity at the commencement of renal replacement therapy is a significant risk factor for death and technique failure.
Such patients should be closely monitored during PD and should be considered for early transfer to an alternative renal replacement therapy if difficulties are experienced.
Publisher: Elsevier BV
Date: 10-2017
DOI: 10.1053/J.AJKD.2016.11.029
Abstract: Survival and quality of life for patients on hemodialysis therapy remain poor despite substantial research efforts. Existing trials often report surrogate outcomes that may not be relevant to patients and clinicians. The aim of this project was to generate a consensus-based prioritized list of core outcomes for trials in hemodialysis. In a Delphi survey, participants rated the importance of outcomes using a 9-point Likert scale in round 1 and then re-rated outcomes in rounds 2 and 3 after reviewing other respondents' scores. For each outcome, the median, mean, and proportion rating as 7 to 9 (critically important) were calculated. 1,181 participants (202 [17%] patients/caregivers, 979 health professionals) from 73 countries completed round 1, with 838 (71%) completing round 3. Outcomes included in the potential core outcome set met the following criteria for both patients/caregivers and health professionals: median score ≥ 8, mean score ≥ 7.5, proportion rating the outcome as critically important ≥ 75%, and median score in the forced ranking question < 10. Patients/caregivers rated 4 outcomes higher than health professionals: ability to travel, dialysis-free time, dialysis adequacy, and washed out after dialysis (mean differences of 0.9, 0.5, 0.3, and 0.2, respectively). Health professionals gave a higher rating for mortality, hospitalization, decrease in blood pressure, vascular access complications, depression, cardiovascular disease, target weight, infection, and potassium (mean differences of 1.0, 1.0, 1.0, 0.9, 0.9, 0.8, 0.7, 0.4, and 0.4, respectively). The Delphi survey was conducted online in English and excludes participants without access to a computer and internet connection. Patients/caregivers gave higher priority to lifestyle-related outcomes than health professionals. The prioritized outcomes for both groups were vascular access problems, dialysis adequacy, fatigue, cardiovascular disease, and mortality. 
This process will inform a core outcome set that in turn will improve the relevance, efficiency, and comparability of trial evidence to facilitate treatment decisions.
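The inclusion rule the abstract states for the potential core outcome set is a set of simple threshold checks on each stakeholder group's Likert ratings. The sketch below applies those published thresholds (median ≥ 8, mean ≥ 7.5, ≥ 75% rating 7–9) to invented example scores; the forced-ranking criterion (median < 10) is omitted, and none of these numbers come from the survey data.

```python
# Screen one candidate outcome against the SONG-HD thresholds described
# above. An outcome enters the potential core set only if BOTH groups
# (patients/caregivers and health professionals) pass all three checks.
def meets_core_criteria(ratings):
    """ratings: list of 1-9 Likert scores from one stakeholder group."""
    ordered = sorted(ratings)
    n = len(ordered)
    median = (ordered[n // 2] if n % 2
              else (ordered[n // 2 - 1] + ordered[n // 2]) / 2)
    mean = sum(ordered) / n
    critical = sum(1 for r in ordered if r >= 7) / n  # rated 7-9
    return median >= 8 and mean >= 7.5 and critical >= 0.75

# Invented example scores for one outcome:
patients = [8, 9, 8, 7, 9]
professionals = [9, 8, 8, 8, 7, 9]
core = meets_core_criteria(patients) and meets_core_criteria(professionals)
print(core)  # True for these illustrative scores
```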
Publisher: Elsevier BV
Date: 07-2012
DOI: 10.1053/J.JRN.2011.08.007
Abstract: To determine vitamin D status in a subtropical climate among an unselected, referred predialysis chronic kidney disease (CKD) population; to assess risks and correlates; and to review whether a higher 25-hydroxyvitamin D (25-OHD) concentration can mitigate the decrement in circulating 1,25-dihydroxyvitamin D (1,25-OHD) normally encountered with advancing CKD. Prospective cross-sectional cohort study. Renal unit in Brisbane, Australia (27°28' S). Five hundred ninety-three consecutive CKD patients (stages 1 to 5). 25-OHD insufficiency (concentrations 15 to 30 ng/mL) and deficiency (<15 ng/mL); bone-mineral parameters, including 1,25-OHD, calcium, and phosphate. Despite potentially higher environmental ultraviolet (UV) exposure, only 48% of patients with CKD were 25-OHD sufficient. Traditional risks for hypovitaminosis D were maintained, and sufficiency was independently predicted by testing in the summer/autumn period (odds ratio [OR]: 2.77, 95% confidence interval [CI]: 1.88 to 4.08, P < .001), male gender (OR: 2.18, 95% CI: 1.46 to 3.24, P < .001), Caucasian race (OR: 2.28, 95% CI: 1.37 to 3.78, P = .001), hypoalbuminemia (OR: 0.47, 95% CI: 0.25 to 0.85, P = .01), macroalbuminuria (OR: 0.60, 95% CI: 0.39 to 0.92, P = .02), and normal body mass index (OR: 1.94, 95% CI: 1.22 to 3.07, P = .005). Vitamin D sufficiency was also associated with higher corrected calcium (per 0.4-mg/dL increment, OR: 1.29, 95% CI: 1.08 to 1.55, P = .005). Although circulating 25-OHD concentrations were relatively maintained across the range of renal function observed, 1,25-OHD concentrations decreased with advancing CKD. 25-OHD insufficiency is mitigated but still highly prevalent in patients with CKD in a high ambient UV environment. Despite the maintenance of relatively higher 25-OHD concentrations with advancing CKD, substrate availability does not appear to be a major determinant of circulating 1,25-OHD.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 10-2009
DOI: 10.2215/CJN.01750309
Publisher: John Wiley & Sons, Ltd
Date: 15-04-2009
Publisher: SAGE Publications
Date: 17-01-2020
Abstract: There is substantial variation in peritonitis rates across peritoneal dialysis (PD) units globally. This may, in part, be related to the wide variability in the content and delivery of training for PD nurse trainers and patients. The aim of this study was to test the feasibility of implementing the Targeted Education ApproaCH to improve Peritoneal Dialysis Outcomes (TEACH-PD) curriculum in real clinical practice settings. This study used mixed methods, including questionnaires and semi-structured interviews (pretraining and post-training) with nurse trainers and patients, to test the acceptability and usability of the PD training modules implemented in two PD units over 6 months. Quantitative data from the questionnaires were analysed descriptively. Interviews were analysed using thematic analysis. Ten PD trainers and 14 incident PD patients were included. Mean training durations to complete the modules were 10.9 h (range 6–17) for PD trainers and 24.9 h (range 15–35) for patients. None of the PD patients experienced PD-related complications at 30 days of follow-up. Three (21%) patients were transferred to haemodialysis due to non-PD-related complications. Ten trainers and 14 PD patients participated in the interviews. Four themes were identified: use of adult learning principles (trainers), comprehension of online modules (trainers), time to complete the modules (trainers), and usability of the manuals (patients). This TEACH-PD study has demonstrated the feasibility of implementing the curriculum in a real clinical setting. The outcomes of this study have informed refinement of the TEACH-PD modules prior to rigorous evaluation of their efficacy and cost-effectiveness in a large-scale study.
Publisher: Wiley
Date: 06-02-2007
DOI: 10.1111/J.1440-1797.2006.00712.X
Abstract: Poor control of bone mineral metabolism (BMM) is associated with renal osteodystrophy and mortality in dialysis-dependent patients. The authors explored the efficacy of alternate nightly home haemodialysis (ANHHD) in controlling BMM parameters and its effects on bone mineral density and histomorphometry. In this prospective observational study, 26 patients on home haemodialysis (3–5 h, 3.5–4 sessions weekly) were converted to ANHHD (6–9 h, 3.5–4 sessions weekly). Biochemical parameters of BMM at baseline, 6 and 12 months, radiological parameters at baseline and 12 months, and bone histomorphometry at 12 months are described. Pre-dialysis serum phosphate fell from 2.13 ± 0.65 to 1.38 ± 0.35 mmol/L (P < 0.0001). No binders were required in 77.2% of patients, compared with 7.7% at baseline. Calcium–phosphate product fell from 5.28 ± 1.64 to 3.42 ± 0.88 mmol²/L². In patients with severe hyperparathyroidism (parathyroid hormone > 1000 ng/L), conversion did not significantly improve parathyroid hormone status. Abnormal bone turnover and mineralization were present in a significant proportion of patients at 12 months, but low turnover was uncommon. Vascular calcification was stabilized or improved in the majority. ANHHD compares favourably with every-night and short daily therapy in relation to BMM management and may offer lifestyle advantages for patients.
Publisher: Springer Science and Business Media LLC
Date: 17-04-2018
Publisher: SAGE Publications
Date: 2019
Abstract: Chronic kidney disease (CKD) is a significant health problem in Canada. Understanding the capacity of the Canadian health-care system to deliver kidney care is important to provide optimal care. To compare Canada's position in relation to countries of similar economic standing. Cross-sectional electronic survey. Member countries of the Organisation for Economic Co-operation and Development (OECD) that participated in the survey. Nephrologists, other physicians, policymakers, and other professionals with relevant expertise in kidney care. Not applicable. A survey administered by the International Society of Nephrology assessed the global capacity of kidney care delivery. Data from participating OECD countries were analyzed using descriptive statistics to compare Canada's position. In most participating countries, kidney care services (excluding medication) were funded by government (transplantation: 85%; dialysis: 81%; acute kidney injury [AKI]: 77%). Most countries covered medication. Canada reported a public funding model for kidney services and a mix of public and private sources for medication. Nephrologist and nephrology trainee densities were lower in Canada than the median (15.33 vs. 25.82 and 1.74 vs. 3.94, respectively). CKD was recognized as a health priority in five countries, but not in Canada. Registries for CKD did not exist in most (24/26) countries. Canada followed a national strategy for noncommunicable diseases, but this was not specific to CKD care, dialysis, or transplantation. Risks of recall bias or social desirability bias are present. Differences in a number of factors could influence discrepancies among countries and were not explored. Responses reflected the existence of practices, policies, and strategies, and may not necessarily describe action or impact.
Capacity of care is not equal across all regions and provinces within Canada; however, the findings are reported at a national level and therefore may not appropriately address variability. This study describes the capacity for kidney care at a national level within the context of the Canadian health system. The Canadian health-care system is well funded by the government; however, there are areas that could be improved to optimize the kidney care provided.
Publisher: SAGE Publications
Date: 05-2018
Abstract: Obesity is increasingly prevalent worldwide, and a greater number of patients initiate renal replacement therapy with a high body mass index (BMI). This study aimed to evaluate the association between BMI and organism-specific peritonitis. All adult patients who initiated peritoneal dialysis (PD) in Australia between January 2004 and December 2013 were included. Data were accessed through the Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry. The co-primary outcomes of this study were times to first organism-specific peritonitis episode, specifically gram-positive, gram-negative, culture-negative, and fungal. Secondary outcomes were individual rates of organism-specific peritonitis for the same 4 microbiological categories. There were 7,381 peritonitis episodes among the 8,343 incident PD patients evaluated. After multivariable adjustment, obese patients (BMI 30–34.9 kg/m²) had an increased risk of fungal peritonitis (adjusted hazard ratio [HR] 1.69, 95% confidence interval [CI] 1.18–2.42), very obese patients (BMI ≥ 35 kg/m²) had a significantly higher risk of gram-positive peritonitis (HR 1.15, 95% CI 1.02–1.30), and both obese and very obese patients experienced significantly higher risks of gram-negative peritonitis (HR 1.29, 95% CI 1.11–1.50 and HR 1.30, 95% CI 1.08–1.57, respectively) compared with patients with normal BMI (20–24.9 kg/m²). Obesity and severe obesity were independently associated with increased incidence rate ratios of all forms of organism-specific peritonitis, with a non-significant trend for the association between severe obesity and gram-negative peritonitis. Among Australian patients, obesity and severe obesity are associated with significantly increased rates of gram-positive, gram-negative, fungal, and culture-negative peritonitis.
Publisher: Scitechnol Biosoft Pvt. Ltd.
Date: 2017
Publisher: Oxford University Press (OUP)
Date: 24-07-2009
DOI: 10.1093/NDT/GFP366
Publisher: BMJ
Date: 04-2019
DOI: 10.1136/BMJOPEN-2018-025653
Abstract: To provide a broad evaluation of the efficacy and safety of oral Chinese herbal medicine (CHM) as an adjunctive treatment for diabetic kidney disease (DKD), including mortality, progression to end-stage kidney disease (ESKD), albuminuria, proteinuria and kidney function. A systematic review and meta-analysis. Randomised controlled trials (RCTs) comparing oral CHM with placebo as an additional intervention to conventional treatments were retrieved from five English databases (Cochrane Central Register of Controlled Trials, MEDLINE, Embase, Allied and Complementary Medicine Database and Cumulative Index of Nursing and Allied Health Literature) and four Chinese databases (China BioMedical Literature, China National Knowledge Infrastructure, Chongqing VIP and Wanfang) from inception to May 2018. RCTs recruiting adult patients with DKD induced by primary diabetes were considered eligible, regardless of the form and ingredients of the oral CHM. Mean difference (MD) or standardised mean difference (SMD) was used to analyse continuous variables, and risk ratio (RR) for dichotomous data. From 7255 reports retrieved, 20 eligible studies involving 2719 DKD patients were included. CHM was associated with greater reduction of albuminuria than placebo, regardless of whether renin–angiotensin system (RAS) inhibitors were concurrently administered (SMD −0.56, 95% CI −1.04 to −0.08, I² = 64%, p = 0.002) or not (SMD −0.92, 95% CI −1.35 to −0.51, I² = 87%, p < 0.0001). When CHM was used as an adjunct to RAS inhibitors, estimated glomerular filtration rate was higher in the CHM group than in the placebo group (MD 6.28 mL/min; 95% CI 2.42 to 10.14; I² = 0%; p = 0.001). The effects of CHM on progression to ESKD and mortality were uncertain due to low event rates. The reported adverse events in the CHM group included digestive disorders, elevated liver enzyme levels, infection, anaemia, hypertension and subarachnoid haemorrhage, but the reported rates were low and similar to those of the control groups.
The favourable results for CHM should be balanced against the limitations of the included studies, such as high heterogeneity, short follow-up periods, small numbers of clinical events and older patients with less advanced disease. Based on moderate- to low-quality evidence, CHM may have beneficial effects on renal function and albuminuria beyond those afforded by conventional treatment in adults with DKD. Further well-conducted, adequately powered trials with representative DKD populations are warranted to confirm the long-term effect of CHM, particularly on clinically relevant outcomes. PROSPERO registration number: CRD42015029293.
Publisher: Elsevier BV
Date: 09-2015
DOI: 10.1016/J.NUMECD.2015.03.015
Abstract: Indoxyl sulfate (IS) and p-cresyl sulfate (PCS) are uremic toxins derived solely from colonic bacterial fermentation of protein. Dietary fiber may counteract this by limiting proteolytic bacterial fermentation. However, the influence of dietary intake on the generation of IS and PCS has not been adequately explored in chronic kidney disease (CKD). This cross-sectional study included 40 CKD participants (60% male; age 69 ± 10 years; 45% diabetic) with a mean estimated glomerular filtration rate (eGFR) of 24 ± 8 mL/min/1.73 m², who were enrolled in a randomized controlled trial of synbiotic therapy. Total and free serum IS and PCS were measured at baseline by ultra-performance liquid chromatography. Dietary intake was measured using in-depth diet histories collected by a dietitian. Associations between each toxin, dietary fiber (total, soluble and insoluble), dietary protein (total, and the amino acids tryptophan, tyrosine and phenylalanine), and the protein-fiber index (ratio of protein to fiber) were assessed using linear regression. Dietary fiber was associated with free and total serum PCS (r = −0.42 and r = −0.44, both p < 0.01), but not IS. No significant association was observed between dietary protein and either toxin. The protein-fiber index was associated with total serum IS (r = 0.40, p = 0.012) and PCS (r = 0.43, p = 0.005), independent of eGFR, sex and diabetes. The dietary protein-fiber index is associated with serum IS and PCS levels. Such an association, beyond fiber and protein alone, highlights the importance of the interplay between these nutrients. We speculate that dietary modification towards a lower protein-fiber index may contribute to lowering IS and PCS.
Publisher: Springer Science and Business Media LLC
Date: 04-09-2017
Publisher: SAGE Publications
Date: 05-2012
Publisher: Elsevier BV
Date: 05-2018
DOI: 10.1053/J.AJKD.2017.12.003
Abstract: Vascular access outcomes in hemodialysis are critically important for patients and clinicians, but frequently are neither patient relevant nor measured consistently in randomized trials. A Standardized Outcomes in Nephrology-Hemodialysis (SONG-HD) consensus workshop was convened to discuss the development of a core outcome measure for vascular access. 13 patients/caregivers and 46 professionals (clinicians, policy makers, industry representatives, and researchers) attended. Participants advocated for vascular access function to be a core outcome based on the broad applicability of function regardless of access type, involvement of a multidisciplinary team in achieving a functioning access, and the impact of access function on quality of life, survival, and other access-related outcomes. A core outcome measure for vascular access required demonstrable feasibility for implementation across different clinical and trial settings. Participants advocated for a practical and flexible outcome measure with a simple actionable definition. Integrating patients' values and preferences was warranted to enhance the relevance of the measure. Proposed outcome measures for function included "uninterrupted use of the access without the need for interventions" and "ability to receive prescribed dialysis," but not "access blood flow," which was deemed too expensive and unreliable. These recommendations will inform the definition and implementation of a core outcome measure for vascular access function in hemodialysis trials.
Publisher: SAGE Publications
Date: 11-2012
Abstract: Since the mid-1990s, early dialysis initiation has dramatically increased in many countries. The Initiating Dialysis Early and Late (IDEAL) study demonstrated that, compared with late initiation, planned early initiation of dialysis was associated with comparable clinical outcomes and increased health care costs. Because residual renal function is a key determinant of outcome and is better preserved with peritoneal dialysis (PD), the present pre-specified subgroup analysis of the IDEAL trial examined the effects of early- compared with late-start dialysis on clinical outcomes in patients whose planned therapy at the time of randomization was PD. Adults with an estimated glomerular filtration rate (eGFR) of 10–15 mL/min/1.73 m² who planned to be treated with PD were randomly allocated to commence dialysis at an eGFR of 10–14 mL/min/1.73 m² (early start) or 5–7 mL/min/1.73 m² (late start). The primary outcome was all-cause mortality. Of the 828 IDEAL trial participants, 466 (56%) planned to commence PD and were randomized to early start (n = 233) or late start (n = 233). The median times from randomization to dialysis initiation were, respectively, 2.03 months (interquartile range [IQR]: 1.67–2.30 months) and 7.83 months (IQR: 5.83–8.83 months). Death occurred in 102 early-start patients and 96 late-start patients (hazard ratio: 1.04; 95% confidence interval [CI]: 0.79–1.37). No differences in composite cardiovascular events, composite infectious deaths, or dialysis-associated complications were observed between the groups. Peritonitis rates were 0.73 episodes (95% CI: 0.65–0.82) per patient-year in the early-start group and 0.69 episodes (95% CI: 0.61–0.78) per patient-year in the late-start group (incidence rate ratio: 1.19; 95% CI: 0.86–1.65; p = 0.29). The proportion of patients planning to commence PD who actually initiated dialysis with PD was higher in the early-start group (80% vs 70%, p = 0.01).
Early initiation of dialysis in patients with stage 5 chronic kidney disease who planned to be treated with PD was associated with clinical outcomes comparable to those seen with late dialysis initiation. Compared with early-start patients, late-start patients who had chosen PD as their planned dialysis modality were less likely to commence on PD.
Publisher: Springer Science and Business Media LLC
Date: 03-10-2011
Publisher: Oxford University Press (OUP)
Date: 08-2019
DOI: 10.1093/NDT/GFZ148
Abstract: Vascular access outcomes reported across haemodialysis (HD) trials are numerous, heterogeneous and not always relevant to patients and clinicians. This study aimed to identify critically important vascular access outcomes. Outcomes derived from a systematic review, multi-disciplinary expert panel and patient input were included in a multilanguage online survey. Participants rated the absolute importance of outcomes using a 9-point Likert scale (7–9 being critically important). The relative importance was determined by a best–worst scale using multinomial logistic regression. Open text responses were analysed thematically. The survey was completed by 873 participants [224 (26%) patients/caregivers and 649 (74%) health professionals] from 58 countries. Vascular access function was considered the most important outcome (mean score 7.8 for patients and caregivers/8.5 for health professionals, with 85%/95% rating it critically important, and top ranked on best–worst scale), followed by infection (mean 7.4/8.2, 79%/92% rating it critically important, second rank on best–worst scale). Health professionals rated all outcomes of equal or higher importance than patients/caregivers, except for aneurysms. We identified six themes: necessity for HD, applicability across vascular access types, frequency and severity of debilitation, minimizing the risk of hospitalization and death, optimizing technical competence and adherence to best practice and direct impact on appearance and lifestyle. Vascular access function was the most critically important outcome among patients/caregivers and health professionals. Consistent reporting of this outcome across trials in HD will strengthen their value in supporting vascular access practice and shared decision making in patients requiring HD.
Publisher: Hindawi Limited
Date: 2011
DOI: 10.1155/2011/750836
Abstract: The incidence of BK virus infection in kidney transplant recipients has increased over recent decades, coincident with the use of more potent immunosuppression. More importantly, posttransplant BK virus replication has emerged as an important cause of graft damage and subsequent graft loss. Immunosuppression has been accepted as a major risk factor for BK virus replication. However, the specific contribution of individual immunosuppressive medications to this risk has not been well established. The purpose of this paper is to provide an overview of the recent literature on the influence of the various immunosuppressant drugs and drug combinations on posttransplant BK virus replication. Evidence supporting the various immunosuppression reduction strategies utilised in the management of BK virus will also be briefly discussed.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 05-2009
DOI: 10.2215/CJN.00010109
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 15-03-2014
DOI: 10.1097/01.TP.0000436906.48802.C4
Abstract: Emerging evidence suggests that uremic toxins, in particular indoxyl sulfate (IS) and p-cresyl sulfate (PCS), may be involved in the pathogenesis of cardiovascular disease. Despite a significant increase in IS and PCS in patients with established kidney damage, the effect of nephrectomy in non-chronic kidney disease patients is not yet known. Forty-two living kidney donors (Caucasian; 76% female [n = 32]; 53 ± 10 years) were enrolled in an observational cohort study and followed up annually for 2 years (before nephrectomy, and 1 and 2 years after nephrectomy). At each time point, patients underwent measurements of serum total and free IS and PCS (using ultrahigh-performance liquid chromatography), carotid intima-media thickness (a measure of arterial stiffness), brachial artery reactivity (both flow-mediated dilatation and sublingual glyceryl trinitrate, markers of endothelial dysfunction), kidney function by the Chronic Kidney Disease Epidemiology Collaboration creatinine-cystatin C equation, and urate and high-sensitivity C-reactive protein using standard laboratory techniques. Kidney function decreased by 30% after nephrectomy (absolute change in estimated glomerular filtration rate: 28 ± 6.9 and 27 ± 7.6 mL/min/1.73 m² at 1 and 2 years, respectively), and the concentrations of the toxins increased by 44% to 100%, remaining elevated at 2 years after nephrectomy (all P < 0.001). Both toxins were associated with carotid intima-media thickness, brachial artery reactivity to glyceryl trinitrate, serum urate, and C-reactive protein levels (all P < 0.03). Further, IS and urate were found to be independent predictors of change in kidney function from baseline at 2 years after nephrectomy (both P < 0.03). This study demonstrated significant and sustained increases in the uremic toxins IS and PCS after nephrectomy. Levels of both toxins were associated with clinically relevant markers of cardiovascular and renal risk, warranting further research in this area.
Publisher: SAGE Publications
Date: 05-2009
DOI: 10.1177/089686080902900315
Abstract: The primary objective of this study is to determine whether daily exit-site application of standardized antibacterial honey (Medihoney Antibacterial Wound Gel; Comvita, Te Puke, New Zealand) results in a reduced risk of catheter-associated infections in peritoneal dialysis (PD) patients compared with standard topical mupirocin prophylaxis of nasal staphylococcal carriers. Multicenter, prospective, open-label, randomized controlled trial. PD units throughout Australia and New Zealand. The study will include both incident and prevalent PD patients (adults and children) for whom informed consent can be provided. Patients will be excluded if they have had (1) a history of psychological illness or condition that interferes with their ability to understand or comply with the requirements of the study; (2) recent (within 1 month) exit-site infection, peritonitis, or tunnel infection; (3) known hypersensitivity to, or intolerance of, honey or mupirocin; (4) current or recent (within 4 weeks) treatment with an antibiotic administered by any route; or (5) nasal carriage of mupirocin-resistant Staphylococcus aureus. 370 subjects will be randomized 1:1 to receive either daily topical exit-site application of Medihoney Antibacterial Wound Gel (all patients) or nasal application of mupirocin if staphylococcal nasal carriage is demonstrated. All patients in the control and intervention groups will perform their usual exit-site care according to local practice. The study will continue until 12 months after the last patient is recruited (anticipated recruitment time is 24 months). The primary outcome measure will be time to first episode of exit-site infection, tunnel infection, or peritonitis, whichever comes first.
Secondary outcome measures will include time to first exit-site infection, time to first tunnel infection, time to first peritonitis, time to infection-associated catheter removal, catheter-associated infection rates, causative organisms, incidence of mupirocin-resistant microbial isolates, and other adverse reactions. This multicenter Australian and New Zealand study has been designed to provide evidence to help nephrologists and their PD patients determine the optimal strategy for preventing PD catheter-associated infections. Demonstration of a significant improvement in PD catheter-associated infections with topical Medihoney will provide clinicians with an important new prophylactic strategy with a low propensity for promoting antimicrobial resistance.
Publisher: Wiley
Date: 26-02-2017
DOI: 10.1111/SDI.12580
Abstract: People with kidney disease are advised to restrict individual nutrients, such as sodium, potassium, and phosphate, in line with current best practice guidelines. However, there is limited evidence to support the efficacy of single nutrient strategies, and compliance remains a challenge for clinicians to overcome. Many factors contribute to poor compliance with dietary prescriptions, including conflicting priorities for single nutrient restriction, the arduous self-monitoring required, and the health-related knock-on effects resulting from targeting these nutrients in isolation. This paper reviews the evidence base for the overall pattern of eating as a potential tool to deliver a diet intervention in which all the nutrients and foods work cumulatively and synergistically to improve clinical outcomes. These interventions may assist in kidney disease management and overcome the innate challenges of single nutrient interventions. Healthy dietary patterns are typically plant-based and lower in sodium and animal proteins. These patterns may have numerous mechanistic benefits for cardiovascular health in kidney disease, most notably through the increase in fruit, vegetables, and plant-based protein, as well as improved gut health through the increase in dietary fiber. The evidence to date on optimal dietary patterns points toward use of a predominantly plant-based diet, and suggests its adoption may improve clinical outcomes in dialysis patients. However, clinical trials are needed to determine whether these diet interventions are feasible, safe, and effective in this patient population.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 11-03-2021
Publisher: Elsevier BV
Date: 03-2020
Publisher: SAGE Publications
Date: 11-2001
DOI: 10.1177/089686080002000623
Abstract: To determine the influence of an elevated body mass index (BMI) on cardiovascular outcomes and survival in peritoneal dialysis (PD) patients. Prospective, observational study of a prevalent PD cohort at a single center. Tertiary care institutional dialysis center. The study included all patients with a BMI of at least 20 who had been receiving PD for at least 1 month as of 31 January 1996 (n = 43). Patients were classified as overweight [BMI > 27.5; mean ± standard error of mean (SEM): 32.1 ± 1.1; n = 14] or normal weight (BMI 20–27.5; mean ± SEM: 23.8 ± 0.4; n = 29). Patient survival and adverse cardiovascular events (myocardial infarction, congestive cardiac failure, cerebrovascular accident, and symptomatic peripheral vascular disease) were recorded over a 3-year period. At baseline, no significant differences were seen between the groups in clinical, biochemical, nutritional, or echocardiographic parameters, except for a lower dietary protein intake (0.97 ± 0.10 g/kg/day vs 1.44 ± 0.10 g/kg/day, p = 0.004) and a higher proportion of well-nourished patients by subjective global assessment (100% vs 72%, p < 0.05) in the overweight group. After 3 years of follow-up, 29% of overweight patients and 69% of normal-weight patients had died (p < 0.05). Using a Cox proportional hazards model, a BMI greater than 27.5 was shown to be an independent positive predictor of patient survival, with an adjusted hazard ratio (HR) of 0.09 [95% confidence interval (CI): 0.01–0.85; p < 0.05]. However, being overweight did not significantly influence myocardial infarction-free survival (adjusted HR: 0.33; 95% CI: 0.07–1.48; p = 0.15) or combined adverse cardiovascular event-free survival (adjusted HR: 0.67; 95% CI: 0.23–1.93; p = 0.46). Obesity conferred a significant survival advantage in our PD population. Obese patients should therefore not be discouraged from receiving PD purely on the basis of BMI.
Moreover, maintaining a higher-than-average BMI to preserve “nutritional reserve” may help to reduce the mortality and morbidity rates associated with PD.
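A hazard ratio like the one reported above rescales survival through the proportional-hazards identity S1(t) = S0(t)^HR. As a rough, unadjusted illustration only (the published HR of 0.09 came from a covariate-adjusted Cox model, so this back-of-envelope figure is not expected to match the observed 29% overweight mortality):

```python
def survival_under_hr(baseline_survival: float, hazard_ratio: float) -> float:
    """Proportional-hazards identity: S_exposed(t) = S_baseline(t) ** HR."""
    return baseline_survival ** hazard_ratio

# Normal-weight 3-year survival implied by the abstract: 1 - 0.69 = 0.31.
# Applying HR = 0.09 crudely predicts ~90% 3-year survival for the
# overweight group (an illustrative, unadjusted calculation).
s_overweight = survival_under_hr(0.31, 0.09)
```

The gap between this crude figure and the observed 71% survival reflects the covariate adjustment in the fitted model, which is exactly why adjusted HRs should not be read as simple rate ratios.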
Publisher: Elsevier BV
Date: 09-2020
Publisher: Springer Science and Business Media LLC
Date: 22-05-2017
Publisher: Wiley
Date: 03-2009
DOI: 10.1111/J.1440-1797.2008.01027.X
Abstract: Dysfunction in apoptosis plays a role in the development of renal cell carcinoma (RCC). This investigation aimed to identify expression of apoptosis-related genes not previously characterized in human RCC. The RCC ACHN cell line was treated with radiation plus interferon-alpha to induce significant apoptosis. Apoptosis RNA microarrays were used to compare control and treated RCC for apoptosis-regulatory genes with significantly altered expression (≥ twofold). Translational correlates were analysed using western blot. Immunohistochemistry of human RCC and non-cancerous kidney in tissue microarrays was also completed. Several gene families, not well characterized in RCC, were significantly upregulated in the RNA microarray. These were the tumour necrosis factor receptor-associated factors (TRAF1, 3 and 4), caspase recruitment domain (NOL3 and PYCARD), and cell death-inducing DFF-45 effector domain (ICAD/CAD) genes. The protein expression patterns did not always increase similarly, perhaps indicating some post-transcriptional controls needing further investigation. TRAF1 had significantly increased expression for both RNA and protein (P<0.01). NOL3 had significantly decreased whole-cell protein expression (P<0.05), but showed strongly localized nuclear positivity in RCC on immunohistochemistry. These newly identified RCC apoptosis genes have shown potential for improving outcome in other cancers and may prove to have the same potential in RCC with further study.
Publisher: Wiley
Date: 14-03-2011
DOI: 10.1002/PATH.2862
Abstract: Erythropoietin (EPO) is a cytokine hormone with cytoprotective effects in many tissues including the brain. Although the benefits of administration of recombinant human EPO (rhEPO) for neonatal hypoxic brain injury have been demonstrated in neuronal tissue, the effect on non-neuronal cell populations is unclear. We tested the hypothesis that rhEPO would protect not only neuronal cells but also glial cells, at a stage of brain development where their maturation is particularly sensitive, and also protect the vasculature. This was evaluated in a rat model of hypoxic injury. 1000 IU/kg rhEPO was delivered intraperitoneally at the start of 4 h of hypoxia or normoxia. Treatment groups of neonatal rats (day of birth; at least N = 10 per group) were as follows: normoxia; normoxia plus rhEPO; hypoxia (8% FiO(2) delivered in temperature-controlled chambers); and hypoxia plus rhEPO. Day of birth in rats is equivalent to human gestation of 28-32 weeks. The effects of rhEPO administration, especially on non-neuronal cell populations, and the associated molecular pathways were investigated. Apoptosis was increased with hypoxia and this was significantly reduced with rhEPO (p < 0.05). Expression of the neuronal marker microtubule-associated protein-2 increased (p < 0.05) when apoptosis was significantly reduced by rhEPO. In addition, compared with hypoxia alone, rhEPO-treated hypoxia showed significant increases (p < 0.05) in expression of the following proteins: the intermediate filament structural protein nestin; myelin basic protein (oligodendrocytes); and glial fibrillary acidic protein (astrocytes). In conclusion, rhEPO protects the developing brain via anti-apoptotic mechanisms and promotes the health of non-neuronal as well as neuronal cell populations at a time when loss of these cells would have long-lasting effects on brain function.
Publisher: Elsevier BV
Date: 05-2021
Publisher: Wiley
Date: 31-07-2019
DOI: 10.1111/NEP.13635
Abstract: Protease-activated receptor 2 (PAR2) has been implicated in the development of renal inflammation and fibrosis. In particular, activation of PAR2 in cultured tubular epithelial cells induces extracellular signal-regulated kinase signalling and secretion of fibronectin, C-C Motif Chemokine Ligand 2 (CCL2) and transforming growth factor-β1 (TGF-β1), suggesting a role in tubulointerstitial inflammation and fibrosis. We tested this hypothesis in unilateral ureteric obstruction (UUO) in which ongoing tubular epithelial cell damage drives tubulointerstitial inflammation and fibrosis. Unilateral ureteric obstruction surgery was performed in groups (n = 9/10) of Par2-/- and wild type (WT) littermate mice which were killed 7 days later. Non-experimental mice were controls. Wild type mice exhibited a 5-fold increase in Par2 messenger RNA (mRNA) levels in the UUO kidney. In situ hybridization localized Par2 mRNA expression to tubular epithelial cells in normal kidney, with a marked increase in Par2 mRNA expression by tubular cells, including damaged tubular cells, in WT UUO kidney. Tubular damage (tubular dilation, increased KIM-1 and decreased α-Klotho expression) and tubular signalling (extracellular signal-regulated kinase phosphorylation) seen in WT UUO were not altered in Par2-/- UUO. In addition, macrophage infiltration, up-regulation of M1 (NOS2) and M2 (CD206) macrophage markers, and up-regulation of pro-inflammatory molecules (tumour necrosis factor, CCL2, interleukin-36α) in WT UUO kidney were unchanged in Par2-/- UUO. Finally, the accumulation of α-SMA+ myofibroblasts, deposition of collagen IV and expression of pro-fibrotic factors (CTGF, TGF-β1) were not different between WT and Par2-/- UUO mice. Protease-activated receptor 2 expression is substantially up-regulated in tubular epithelial cells in the obstructed kidney, but this does not contribute to the development of tubular damage, renal inflammation or fibrosis.
Publisher: Frontiers Media SA
Date: 10-03-2021
DOI: 10.3389/FPHYS.2021.615428
Abstract: Coagulation abnormalities and increased risk of atherothrombosis are common in patients with chronic kidney disease (CKD). Mechanisms that alter renal hemostasis and lead to thrombotic events are not fully understood. Here we show that activation of protease-activated receptor-2 (PAR2) on human kidney tubular epithelial cells (HTECs) induces tissue factor (TF) synthesis and secretion that enhances blood clotting. The PAR-activating coagulation-associated protease thrombin, as well as specific PAR2 activators (matriptase, trypsin, or the synthetic agonist 2f-LIGRLO-NH2 [2F]), induced TF synthesis and secretion that were potently inhibited by the PAR2 antagonist I-191. Thrombin-induced TF was also inhibited by a PAR1 antagonist, vorapaxar. Peptide activators of PAR1, PAR3, and PAR4 failed to induce TF synthesis. Differential centrifugation of the 2F-conditioned medium sedimented the secreted TF, together with the exosome marker ALG-2 interacting protein X (ALIX), indicating that secreted TF was associated with extracellular vesicles. 2F-treated HTEC conditioned medium significantly enhanced blood clotting, which was prevented by pre-incubating this medium with an antibody against TF. In summary, activation of PAR2 on HTECs stimulates synthesis and secretion of TF that induces blood clotting, and this is attenuated by PAR2 antagonism. Thrombin-induced TF synthesis is at least partly mediated by PAR1 transactivation of PAR2. These findings reveal how underlying hemostatic imbalances might increase thrombosis risk and subsequent chronic fibrin deposition in the kidneys of patients with CKD, and suggest PAR2 antagonism as a potential therapeutic strategy for intervening in CKD progression.
Publisher: Elsevier BV
Date: 05-2021
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 23-06-2016
Abstract: Technical innovations in peritoneal dialysis (PD), now used widely for the long-term treatment of ESRD, have significantly reduced therapy-related complications, allowing patients to be maintained on PD for longer periods. Indeed, the survival rate for patients treated with PD is now equivalent to that with in-center hemodialysis. In parallel, changes in public policy have spurred an unprecedented expansion in the use of PD in many parts of the world. Meanwhile, our improved understanding of the molecular mechanisms involved in solute and water transport across the peritoneum and of the pathobiology of structural and functional changes in the peritoneum with long-term PD has provided new targets for improving efficiency and for intervention. As with hemodialysis, almost half of all deaths on PD occur because of cardiovascular events, and there is great interest in identifying modality-specific factors contributing to these events. Notably, tremendous progress has been made in developing interventions that substantially reduce the risk of PD-related peritonitis. Yet the gains have been unequal among individual centers, primarily because of unequal clinical application of knowledge gained from research. The work to date has further highlighted the areas in need of innovation as we continue to strive to improve the health and outcomes of patients treated with PD.
Publisher: John Wiley & Sons, Ltd
Date: 21-01-2009
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 09-2013
DOI: 10.2215/CJN.12361212
Abstract: The effect of biocompatible peritoneal dialysis (PD) solutions on PD-related peritonitis is unclear. This study sought to evaluate the relationship between use of biocompatible solutions and the probability of occurrence or clinical outcomes of peritonitis. The study included all incident Australian patients receiving PD between January 1, 2007, and December 31, 2010, using Australia and New Zealand Dialysis and Transplant Registry data. All multicompartment PD solutions of neutral pH were categorized as biocompatible solutions. The independent predictors of peritonitis and the use of biocompatible solutions were determined by multivariable, multilevel mixed-effects Poisson and logistic regression analysis, respectively. Sensitivity analyses, including propensity score matching, were performed. Use of biocompatible solutions gradually declined (from 7.5% in 2007 to 4.2% in 2010), with preferential use among smaller units and among younger patients without diabetes mellitus. Treatment with biocompatible solution was associated with a significantly greater overall rate of peritonitis (0.67 versus 0.47 episodes per patient-year; incidence rate ratio [IRR], 1.49; 95% confidence interval [CI], 1.19 to 1.89) and with shorter time to first peritonitis (hazard ratio [HR], 1.48; 95% CI, 1.17 to 1.87), a finding replicated in propensity score–matched cohorts (HR, 1.36; 95% CI, 1.09 to 1.71). In an observational registry study, use of biocompatible PD solutions was associated with higher overall peritonitis rates and shorter time to first peritonitis. Further randomized studies adequately powered for a primary peritonitis outcome are warranted.
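The crude incidence rate ratio implied by the rates above can be checked with a few lines of arithmetic; note that the registry's reported IRR of 1.49 came from multivariable, multilevel Poisson regression, so the crude ratio differs slightly. The Wald-interval helper below takes event counts and person-years, which the abstract does not report, so the symmetric counts in the usage line are placeholders of my own (a sketch, not the study's method):

```python
import math

def crude_irr(rate_exposed: float, rate_control: float) -> float:
    """Crude incidence rate ratio: ratio of events per patient-year."""
    return rate_exposed / rate_control

def irr_wald_ci(events_exposed: int, py_exposed: float,
                events_control: int, py_control: float, z: float = 1.96):
    """Wald 95% CI for an IRR: exp(log IRR +/- z * sqrt(1/a + 1/b))."""
    irr = (events_exposed / py_exposed) / (events_control / py_control)
    se = math.sqrt(1 / events_exposed + 1 / events_control)  # SE of log IRR
    return irr, irr * math.exp(-z * se), irr * math.exp(z * se)

# Rates reported in the abstract (episodes per patient-year):
ratio = crude_irr(0.67, 0.47)  # ~1.43 crude, versus the adjusted IRR of 1.49
```

The gap between the crude 1.43 and the published 1.49 is the adjustment for patient- and unit-level covariates in the mixed-effects model.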
Publisher: Oxford University Press (OUP)
Date: 1996
Publisher: Wiley
Date: 28-12-2011
DOI: 10.1111/J.1440-1797.2011.01520.X
Abstract: Uraemia is associated with hyperprolactinaemia, low total (TT) and free (FT) serum testosterone, high luteinizing hormone (LH) and follicle-stimulating hormone (FSH) and, in women, anovulatory cycles and premature menopause. We hypothesized that extended hours haemodialysis may improve these derangements. This is an observational cohort study of 30 men (age 54±13 years, body mass index (BMI) 28.1±5.8 kg/m2) and seven women (age 41±11 years, BMI 32.2±11.2 kg/m2) established on chronic home haemodialysis (3-5 h, 3.5-5 sessions weekly) who were converted to nocturnal home haemodialysis (6-9 h, 3.5-5 sessions weekly). Serum was collected at baseline and 6 months for measurement of TT, sex hormone binding globulin (SHBG), LH, FSH, prolactin, thyroid-stimulating hormone and thyroxine. In the male patients (n=25), serum prolactin fell significantly (281 (209.5-520) vs 243 (187-359) mU/L, P=0.001), while TT (12.6±5.8 vs 15.2±8.1 nmol/L, P=0.06) and FT (281±118 vs 359±221 pmol/L, P=0.01) increased. SHBG, LH and FSH were unchanged. At 6 months, two of the three women under 40 years of age had return of regular menses after being amenorrhoeic or having prolonged and irregular menses at baseline. There were insufficient women in this study to further analyse changes in sex hormone levels. Thyroid function tests remained stable. Alternate nightly nocturnal haemodialysis significantly improves hyperprolactinaemia and hypotestosteronaemia in men. Menstrual cycling may be re-established in young women. The effect of these changes on fertility has not been established. Patients should be counselled about the possibility of increased fertility before conversion to extended hours haemodialysis regimens.
Publisher: Wiley
Date: 02-2007
DOI: 10.1111/J.1440-1797.2006.00746.X
Abstract: The protease-activated receptor-2 (PAR-2), the second of four members of a unique subfamily of G-protein coupled receptors, is abundantly expressed in the kidney. In a similar manner to the other PARs, cleavage of its extracellular N-terminus exposes a tethered ligand (SLIGKV in humans), which acts as an intramolecular ligand to activate the receptor. In the kidney, PAR-2 expression has been variably reported in collecting duct cells, mesangial cells, interstitial fibroblasts, vascular endothelial cells, vascular smooth muscle cells and proximal tubular cells. Despite this renal expression data, the function of PAR-2 in the kidney remains unknown. More than 15 different mammalian serine proteases have been shown to activate PAR-2 in an in vitro setting, but it is still unclear which of these are physiologically relevant activators of PAR-2 in specific tissues. Their identification could provide novel therapeutic targets. PAR-2 activates a number of downstream signalling molecules, including protein kinase C, extracellular signal-regulated kinase and nuclear factor kappa-B. Proteases that can activate PAR-2 are generated and released from cells during injury, inflammation and malignancy, and can thus signal to cells under these conditions. Potential physiological and pathophysiological roles for PAR-2 in the kidney include the regulation of inflammation, blood flow and ion transport, as well as tissue protection, repair and fibrosis. In this review, these potential roles of PAR-2 in the kidney are highlighted and discussed.
Publisher: Oxford University Press (OUP)
Date: 02-2004
DOI: 10.1093/NDT/GFG547
Abstract: Erythropoietin (EPO) has recently been shown to exert important cytoprotective and anti-apoptotic effects in experimental brain injury and cisplatin-induced nephrotoxicity. The aim of the present study was to determine whether EPO administration is also renoprotective in both in vitro and in vivo models of ischaemic acute renal failure. Primary cultures of human proximal tubule cells (PTCs) were exposed to either vehicle or EPO (6.25-400 IU/ml) in the presence of hypoxia (1% O(2)), normoxia (21% O(2)) or hypoxia followed by normoxia for up to 24 h. The end-points evaluated included cell apoptosis (morphology and in situ end labelling [ISEL]), viability (lactate dehydrogenase [LDH] release), cell proliferation (proliferating cell nuclear antigen [PCNA]) and DNA synthesis (thymidine incorporation). The effects of EPO pre-treatment (5000 U/kg) on renal morphology and function were also studied in rat models of unilateral and bilateral ischaemia-reperfusion (IR) injury. In the in vitro model, hypoxia (1% O(2)) induced a significant degree of PTC apoptosis, which was substantially reduced by co-incubation with EPO at 24 h (vehicle 2.5+/-0.5% vs 25 IU/ml EPO 1.8+/-0.4% vs 200 IU/ml EPO 0.9+/-0.2%, n = 9, P<0.05). At high concentrations (400 IU/ml), EPO also stimulated thymidine incorporation in cells exposed to hypoxia with or without subsequent normoxia. LDH release was not significantly affected. In the unilateral IR model, EPO pre-treatment significantly attenuated outer medullary thick ascending limb (TAL) apoptosis (EPO 2.2+/-1.0% of cells vs vehicle 6.5+/-2.2%, P<0.05, n = 5) and potentiated mitosis (EPO 1.1+/-0.3% vs vehicle 0.5+/-0.3%, respectively, P<0.05) within 24 h.
EPO-treated rats exhibited enhanced PCNA staining within the proximal straight tubule (6.9+/-0.7% vs vehicle 2.4+/-0.5% vs sham 0.3+/-0.2%, P<0.05), proximal convoluted tubule (2.3+/-0.6% vs vehicle 1.1+/-0.3% vs sham 1.2+/-0.3%, P<0.05) and TAL (4.7+/-0.9% vs vehicle 0.6+/-0.3% vs sham 0.3+/-0.2%, P<0.05). The frequency of tubular profiles with luminal cast material was also reduced (32.0+/-1.6 vs vehicle 37.0+/-1.3%, P = 0.05). EPO-treated rats subjected to bilateral IR injury exhibited similar histological improvements to the unilateral IR injury model, as well as significantly lower peak plasma creatinine concentrations than their vehicle-treated controls (0.04+/-0.01 vs 0.21+/-0.08 mmol/l, respectively, P<0.05). EPO had no effect on renal function in sham-operated controls. The results suggest that, in addition to its well-known erythropoietic effects, EPO inhibits apoptotic cell death, enhances tubular epithelial regeneration and promotes renal functional recovery in hypoxic or ischaemic acute renal injury.
Publisher: S. Karger AG
Date: 2000
DOI: 10.1159/000045794
Publisher: Elsevier BV
Date: 05-2018
DOI: 10.1016/J.SEMNEPHROL.2018.02.007
Abstract: Cardiovascular disease (CVD) is highly prevalent in the peritoneal dialysis (PD) population, affecting up to 60% of cohorts. CVD is the primary cause of death in up to 40% of PD patients in Australia, New Zealand, and the United States. Cardiovascular mortality rates are reported to be approximately 14 per 100 patient-years, which are 10- to 20-fold greater than those of age- and sex-matched controls. The excess risk of CVD is related to a combination of traditional risk factors (such as hypertension, dyslipidemia, obesity, smoking, sedentary lifestyle, and insulin resistance), nontraditional (kidney disease-related) risk factors (such as anemia, chronic volume overload, inflammation, malnutrition, hyperuricemia, and mineral and bone disorder), and PD-specific risk factors (such as dialysis solutions, glycation end products, hypokalemia, residual kidney function, and ultrafiltration failure). Interventions targeting these factors may mitigate cardiovascular risk, although high-level clinical evidence is lacking. This review summarizes the evidence relating to cardiovascular interventions targeting modifiable CVD risk factors in PD patients, as well as highlighting the key recommendations of the International Society for Peritoneal Dialysis Cardiovascular and Metabolic Guidelines.
Publisher: Frontiers Media SA
Date: 07-2003
Publisher: Wiley
Date: 05-02-2021
DOI: 10.1111/NEP.13853
Publisher: Elsevier BV
Date: 02-2019
DOI: 10.1016/J.KINT.2018.10.018
Abstract: The Kidney Disease: Improving Global Outcomes (KDIGO) initiative organized a Controversies Conference on glomerular diseases in November 2017. The conference focused on the 2012 KDIGO guideline with the aim of identifying new insights into nomenclature, pathogenesis, diagnostic work-up, and, in particular, therapy of glomerular diseases since the guideline's publication. It was the consensus of the group that most guideline recommendations, in particular those dealing with therapy, will need to be revisited by the guideline-updating Work Group. This report covers general management of glomerular disease, IgA nephropathy, and membranous nephropathy.
Publisher: Elsevier BV
Date: 03-2021
Publisher: SAGE Publications
Date: 09-2012
Abstract: A multicenter, multi-country randomized controlled trial (the balANZ study) recently reported that peritonitis rates significantly improved with the use of neutral-pH peritoneal dialysis (PD) solutions low in glucose degradation products (“biocompatible”) compared with standard solutions. The present paper reports a secondary outcome analysis of the balANZ trial with respect to peritonitis microbiology, treatment, and outcomes. Adult incident PD patients with residual renal function were randomized to receive either biocompatible or conventional (control) PD solutions for 2 years. The safety population analysis for peritonitis included 91 patients in each group. The unadjusted geometric mean peritonitis rates in those groups were 0.30 [95% confidence interval (CI): 0.22 to 0.41] episodes per patient-year for the biocompatible group and 0.49 (95% CI: 0.39 to 0.62) episodes per patient-year for the control group [incidence rate ratio (IRR): 0.61; 95% CI: 0.41 to 0.90; p = 0.01]. When specific causative organisms were examined, the rates of culture-negative, gram-positive, gram-negative, and polymicrobial peritonitis episodes were not significantly different between the biocompatible and control groups, although the biocompatible group did experience a significantly lower rate of non-pseudomonal gram-negative peritonitis (IRR: 0.41; 95% CI: 0.18 to 0.92; p = 0.03). Initial empiric antibiotic regimens were comparable between the groups. Biocompatible fluid use did not significantly reduce the risk of peritonitis-associated hospitalization (adjusted odds ratio: 0.80; 95% CI: 0.48 to 1.34), but did result in a shorter median duration of peritonitis-associated hospitalization (6 days vs 11 days, p = 0.05). Peritonitis severity was more likely to be rated as mild in the biocompatible group (37% vs 10%, p = 0.001). Overall peritonitis-associated technique failures and peritonitis-related deaths were comparable in the two groups.
Biocompatible PD fluid use was associated with a broad reduction in gram-positive, gram-negative, and culture-negative peritonitis that reached statistical significance for non-pseudomonal gram-negative organisms. Peritonitis hospitalization duration was shorter, and peritonitis severity was more commonly rated as mild in patients receiving biocompatible PD fluids, although other peritonitis outcomes were comparable between the groups.
Publisher: Elsevier BV
Date: 04-2020
Publisher: Wiley
Date: 17-12-2013
DOI: 10.1111/J.1440-1797.2012.01662.X
Abstract: To assess the impact of vitamin D supplementation (cholecalciferol) on the insulin sensitivity and metabolic health of patients with chronic kidney disease (CKD). Twenty-eight adult patients with CKD stages 3-4 were recruited from the outpatient department of the Princess Alexandra Hospital (Brisbane, Australia) to a double-blind randomized trial of cholecalciferol (vitamin D3) 2000 IU/day or placebo for 6 months. Metabolic parameters at baseline were compared with 20 non-CKD adults. The primary outcome was an improvement in insulin resistance (glucose disposal rate, GDR) at 6 months (quantified by hyperinsulinaemic euglycaemic clamp). Carbohydrate and lipid oxidation rates were assessed by indirect calorimetry. At baseline, patients were significantly insulin-resistant compared with lean younger non-CKD individuals (n = 9; GDR 3.42 vs. 5.76 mg/kg per minute, P = 0.001), but comparable with their age-, gender- and weight-matched non-CKD counterparts (n = 11; 3.42 vs. 3.98 mg/kg per minute, P = 0.4). 25-Hydroxyvitamin D did not change in the placebo group, but rose from 95 ± 37 to 146 ± 25 nmol/L with treatment (P = 0.0001). Post treatment, there was no difference in GDR between groups (GDR 3.38 vs. 3.52 mg/kg per minute, ANCOVA P = 0.4). There was a relative increase in hyperinsulinaemic oxidative disposal of glucose with treatment (within-group P = 0.03). Supplementation with cholecalciferol in CKD stages 3-4 results in appreciable increases in 25-hydroxyvitamin D concentrations, but does not increase insulin sensitivity. The insulin resistance observed was similar among age-, sex- and body mass index-matched individuals with and without CKD. Whether renal dysfunction per se has any influence on the insulin sensitivity of an individual should be the subject of future work.
Publisher: Elsevier BV
Date: 03-2021
Publisher: Elsevier BV
Date: 10-2020
Publisher: Springer Science and Business Media LLC
Date: 17-12-2022
Publisher: Elsevier BV
Date: 03-2020
Publisher: Informa UK Limited
Date: 12-2007
Abstract: Anaemia which develops as a consequence of malignancies is often treated using recombinant human erythropoietin (rhEpo). Epo is now known to act as an anti-apoptotic factor for a wide range of cell types that express Epo receptors (EpoRs), and its co-use with cancer therapies can act detrimentally to diminish therapy-induced apoptosis. This had not been analyzed for renal cell carcinomas (RCCs). We examined the influence of rhEpo on the ability of cisplatin to induce apoptosis in RCCs. Two RCC cell lines (SN12K1 and ACHN) were compared with a non-RCC renal epithelial cell line (HK2). Cells were treated with 50 microM cisplatin with and without 200 IU/mL rhEpo and were compared for apoptosis, mitosis and protein expression of EpoR, nuclear factor-kappaB (NFkappaB), protein kinase C (PKC), Bcl-2, Bax and cyclin-D1. Experiments were repeated with PKC promotion (PMA, 20 nM) or inhibition (H7, 10 microM). rhEpo reduced cisplatin-induced apoptosis in RCCs (p < 0.01), compared with HK-2s. EpoR expression was increased only in SN12K1 with rhEpo, with and without cisplatin. NFkappaB, Bax and Bcl-2 expression was unchanged. PKC protein expression was significantly reduced in cisplatin-treated RCCs with rhEpo, correlating with reduced apoptosis. When the PKC pathway was inhibited in these cells, levels of apoptosis returned to normal for cisplatin treatment, indicating activation of the PKC pathway by rhEpo. PMA promotion increased mitosis only in the RCCs, with and without rhEpo (p < 0.05). In summary, rhEpo reduced cisplatin-induced apoptosis of RCCs and promoted their mitosis via PKC-dependent pathways. These findings suggest caution in the use of rhEpo to treat anaemia in patients with RCC.
Publisher: American Physiological Society
Date: 15-03-2014
DOI: 10.1152/AJPRENAL.00241.2013
Abstract: Treatment of renal ischemia-reperfusion (IR) injury with recombinant human erythropoietin (rhEPO) reduces acute kidney injury and improves function. We aimed to investigate whether progression to chronic kidney disease associated with acute injury was also reduced by rhEPO treatment, using in vivo and in vitro models. Rats were subjected to bilateral 40-min renal ischemia, and kidneys were studied at 4, 7, and 28 days postreperfusion for renal function, tubular injury and repair, inflammation, and fibrosis. Acute injury was modulated using rhEPO (1,000 or 5,000 IU/kg, intraperitoneally) at the time of reperfusion. Renal tubular epithelial cells or fibroblasts in culture were subjected to hypoxia or oxidative stress, with or without rhEPO (200 IU/ml), and fibrogenesis was studied. The results of the in vivo model confirmed functional and structural improvement with rhEPO at 4 days post-IR (P < 0.05). At 7 days post-IR, fibrosis and myofibroblast stimulation were increased with IR with and without rhEPO (P < 0.01). However, at 28 days post-IR, renal fibrosis and myofibroblast numbers were significantly greater with IR plus rhEPO (P < 0.01) compared with IR only. Mechanistically, rhEPO stimulated profibrotic transforming growth factor-β, oxidative stress (marker 8-hydroxy-deoxyguanosine), and phosphorylation of the signal transduction protein extracellular signal-regulated kinase. In vitro, rhEPO protected tubular epithelium from apoptosis but stimulated epithelial-to-mesenchymal transition, and also protected and activated fibroblasts, particularly with oxidative stress. In summary, although rhEPO was protective of renal function and structure in acute kidney injury, the supraphysiological dose needed for renoprotection contributed to fibrogenesis and stimulated chronic kidney disease in the long term.
Publisher: Public Library of Science (PLoS)
Date: 23-08-2019
Publisher: Oxford University Press (OUP)
Date: 19-07-2005
DOI: 10.1093/NDT/GFH987
Abstract: Cardiovascular disease is the major cause of death in the end-stage renal disease population. Novel risk factors such as homocysteine (Hcy) are of considerable interest in this group as hyperhomocysteinaemia is highly prevalent in the setting of renal impairment. Folic acid-vitamin B group therapies are only partially effective treatments. Hcy is highly protein-bound and thus poorly dialysed. Dialyzers with albumin-leaking properties have been shown to result in lowering of plasma Hcy. As the FX-class (Advanced Fresenius Polysulfone dialyzer) has greater clearance of larger molecular weight substances but is non-albumin-leaking, we explored the capacity of this new membrane technology to reduce plasma Hcy levels. A prospective randomized cross-over trial was conducted in 35 prevalent haemodialysis patients, with one group receiving 12 weeks of dialysis using the FX dialyzer followed by 12 weeks of standard high-flux dialysis (SHF), and the other group receiving SHF followed by the FX dialyzer. All patients received vitamin B(6) 25 mg and folic acid 5 mg daily throughout the study. The primary outcome was plasma Hcy pre-dialysis at week 12. FX vs SHF showed no significant difference, 25+/-6.6 vs 25.9+/-5.8 micromol/l, Delta 95% CI = -2.77 to 4.59, P = 0.31. There was a non-significant trend toward a decrease in Hcy in both groups (27.43+/-7.68 to 25.91+/-5.78 micromol/l for SHF, P = 0.23 and 26.0+/-4.58 to 25.0+/-6.61 micromol/l for FX, P = 0.28). Analysis by repeated measures method demonstrated a statistically significantly lower Hcy with FX vs SHF dialyzer (adjusted beta = -1.30, 95% CI = -2.41 to -0.19, P = 0.022). K(t)/V(urea) was higher in FX vs SHF (1.35+/-0.18 vs 1.22+/-0.2, P = 0.013). Folate and B(6) levels did not change. The primary outcome analysis did not show any significant difference in pre-dialysis Hcy comparing FX and SHF membranes. 
Although our secondary analysis demonstrated a statistically significant difference between membranes, the magnitude of the difference (1.3 micromol/l) is not clinically significant. Thus the use of the FX dialyzer did not result in a clinically significant benefit in relation to improving pre-dialysis Hcy compared with standard high-flux dialysis.
Publisher: Elsevier BV
Date: 10-2023
Publisher: Wiley
Date: 10-2008
DOI: 10.1111/J.1440-1797.2008.01039.X
Abstract: While deceased donor kidney transplantation rates have remained stagnant, live donor kidney transplantation (LDKT) rates have increased significantly over the last decade, and are now a major component of renal transplantation programmes worldwide. Additionally, there has been an increased utilization of more marginal donors, including donors who are obese, older and subjects with well-controlled hypertension. A retrospective audit of all live donors at the Princess Alexandra Hospital Renal Transplantation unit was performed from 24 August 1982 to 29 May 2007 to assess any change in donor characteristics over time. There were 373 live donor operations. Over the last 25 years there has been a significant increase in the number of donors who are either older or obese. Furthermore, there is a greater proportion of spousal and emotionally related LDKT. It is imperative that donors, in particular marginal donors, are followed up long-term to determine their risk of kidney and cardiovascular disease and to initiate appropriate treatment if required.
Publisher: Elsevier BV
Date: 12-2018
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 30-04-2020
DOI: 10.2215/CJN.13101019
Abstract: Shared decision making in patients with glomerular disease remains challenging because outcomes important to patients remain largely unknown. We aimed to identify and prioritize outcomes important to patients and caregivers and to describe reasons for their choices. We purposively sampled adult patients with glomerular disease and their caregivers from Australia, Hong Kong, the United Kingdom, and the United States. Participants identified, discussed, and ranked outcomes in focus groups using the nominal group technique; a relative importance score (between zero and one) was calculated. Qualitative data were analyzed thematically. Across 16 focus groups, 134 participants (range, 19–85 years old; 51% women), including 101 patients and 33 caregivers, identified 58 outcomes. The ten highest-ranked outcomes were kidney function (importance score of 0.42), mortality (0.29), need for dialysis or transplant (0.22), life participation (0.18), fatigue (0.17), anxiety (0.13), family impact (0.12), infection and immunity (0.12), ability to work (0.11), and BP (0.11). Three themes explained the reasons for these rankings: constraining day-to-day experience, impaired agency and control over health, and threats to future health and family. Patients with glomerular disease and their caregivers highly prioritize kidney health and survival, but they also prioritize life participation, fatigue, anxiety, and family impact.
Publisher: Oxford University Press (OUP)
Date: 12-07-2005
DOI: 10.1093/NDT/GFH980
Abstract: Cardiac events (CE: cardiac death, non-fatal myocardial infarction and acute coronary syndrome) are the principal causes of death in patients with chronic kidney disease (CKD). We sought to devise and validate a cardiac risk score to risk-stratify patients with CKD. Clinical history and biochemical data were obtained in 167 CKD patients. CE were recorded over a median follow-up of 22 months. The hazard ratio (HR) of each independent variable using Cox regression analysis was used to derive a cardiac risk score for the prediction of events. The cardiac risk score was then applied to a validation population of 99 CKD patients to confirm its validity in predicting CE. CE occurred in 20 patients in the derivation group. The independent predictors of CE were cardiac history (HR 9.83, P = 0.001), body mass index (BMI; HR 1.15, P = 0.002), dialysis duration (HR 1.24, P = 0.004) and serum phosphate (HR 4.29, P = 0.001). The resulting cardiac risk score (range 26-67) gave an area under the receiver operating characteristic (ROC) curve of 0.86. CE occurred in 25 patients in the validation group; the ROC curve area was similar (0.84, P = 0.11). An optimal cardiac risk score cut-off of 50 assigned high risk to 29% of the derivation and 35% of the validation group (P = 0.26). CE occurred in 35 and 57% of the high-risk derivation and validation groups, respectively (P = 0.09), and in 2 and 8% of the low-risk groups (P = 0.15). Application of a cardiac risk score using cardiac history, dialysis duration, BMI and phosphate identifies CKD patients at risk of future CE.
Publisher: Springer Science and Business Media LLC
Date: 15-04-2010
DOI: 10.1007/S00467-010-1510-5
Abstract: Peritonitis is a common complication and major cause of morbidity in children on peritoneal dialysis. In this retrospective longitudinal study, we analysed data retrieved from the Australian and New Zealand Dialysis and Transplant Registry (ANZDATA) on 167 patients aged less than 18 years who were treated with peritoneal dialysis during the period from October 2003 to December 2007. During this period there were 100 episodes of peritonitis in 57 patients (0.71 episodes per patient-year), with Gram-positive organisms most commonly isolated (44%). Peritonitis occurred frequently in the first 6 months after starting dialysis, with survival analysis showing peritonitis-free survival rates of 72%, 56% and 36% at 6 months, 1 year and 2 years, respectively. Age was a weak predictor of peritonitis on univariate analysis, but previous peritonitis was the only significant predictor in a multivariate Cox proportional hazards model (adjusted hazard ratio 2.02, 95% CI 1.20 to 3.40, p = 0.008). Peritonitis episodes infrequently resulted in relapse (5%), recurrence (7%) or the need for either temporary or permanent haemodialysis (5% and 7%, respectively) and there were no patient deaths directly attributable to peritonitis. Compared with single organism peritonitis, polymicrobial peritonitis was not associated with any statistically significant differences in outcome. Further prospective studies are required to determine the most appropriate prophylactic measures and antibiotic regimens for use in pediatric patients.
Publisher: SAGE Publications
Date: 22-04-2021
Abstract: A differential association between mortality and cause of end-stage kidney disease in patients with type 2 diabetes mellitus (T2DM) has been shown. Sex-specific differences in diabetes-related complications have been described. It is unclear whether sex affects the associations between diabetes and peritoneal dialysis (PD) technique and patient survival. Using the Australia and New Zealand Dialysis and Transplant Registry, we examined a two-way interaction between sex and diabetes status (no diabetes, T2DM and non-diabetic nephropathy [T2DM + non-DN] and T2DM and diabetic nephropathy [T2DM + DN]) for PD technique failure (including death), all-cause mortality and cause-specific mortality in incident adult PD patients between 1996 and 2016 using adjusted Cox regression. Mediation analysis was conducted to determine whether peritonitis was a mediator in these associations. In 8279 PD patients, those with T2DM + DN had the greatest risks of technique failure, all-cause mortality and cause-specific mortality, followed by patients with T2DM + non-DN, then patients without diabetes. Sex modified the association with diabetes status in technique failure (p interaction = 0.001) and cardiac mortality (p interaction = 0.008). In women with T2DM + DN, the adjusted hazard ratio (HR) for technique failure was 1.45 (1.30–1.62) and was higher than in men with T2DM + DN (1.17 [1.08–1.28]; referent: no diabetes). In women with T2DM + DN, the adjusted HR for cardiac mortality was 2.12 (1.73–2.61) and was also higher than in men with T2DM + DN (1.66 [1.43–1.95]). Less than 10% of the effect between diabetes and PD technique failure or mortality was mediated by peritonitis. PD patients with diabetic nephropathy had increased risk of PD technique failure and mortality, with the magnitude of these risks greater in women.
Publisher: SAGE Publications
Date: 03-2018
Abstract: Evidence of effective interventions to prevent peritoneal dialysis (PD) catheter malfunction before first use is presently insufficient to guide clinical care. Regular flushing of the PD catheter (e.g. before PD commencement) has been adopted by some practitioners in the belief that it will prevent catheter obstruction and/or malfunction. The aim of this study was to characterize and evaluate PD catheter flushing practices across Australian and New Zealand PD units. An on-line survey was distributed to all 62 PD units in Australia (12 August 2016; n = 51) and New Zealand (2 February 2017; n = 11), with questions relating to PD catheter flushing practices, audit, and outcomes. Forty-nine units of variable size (16 to 100 patients) completed the survey (79% response rate). All centers flushed PD catheters at some stage after insertion as routine unit practice. Forty-one units (84%) routinely flushed during periods of PD rest at varying intervals ranging from alternate daily to monthly. The type and volume of solution used to flush varied between units. Units that practised routine flushing of PD catheters were almost twice as likely to audit their catheter-related outcomes (66% vs 38%, p = 0.23) and more likely to have reported blocked catheters in the preceding 12 months (84% vs 0%, p = 0.01) compared with those units that did not routinely flush PD catheters. Thirty units (61%) regularly audited and monitored catheter-related outcomes. This study identified a wide variation in center practices relating to PD catheter flushing. Drawing conclusions about any relationship between flushing practices and clinical outcomes was impeded by the relatively low uptake of regular auditing and monitoring of catheter-related outcomes across surveyed units. Evaluation of the benefits and harms of standardized PD catheter flushing practices on patient outcomes in a randomized trial is needed to guide practice.
Publisher: Informa UK Limited
Date: 27-08-2015
DOI: 10.1586/17446651.2015.1079124
Abstract: Chronic kidney disease is associated with an accelerated risk of cardiovascular (CV) mortality. Seminal work over the last decade has identified abnormal bone metabolism as an important modulator of the increased CV burden in this cohort. In particular, FGF23, a phosphaturic hormone with serum levels found to be markedly elevated in chronic kidney disease, is independently associated with increased risks of all-cause mortality and CV events. This editorial will discuss the proposed mechanisms linking FGF23 to CV disease in chronic kidney disease, namely, direct cardiac myocyte toxicity, endothelial dysfunction and vascular calcification.
Publisher: Oxford University Press (OUP)
Date: 11-09-2009
DOI: 10.1093/NDT/GFP466
Publisher: SAGE Publications
Date: 2019
Abstract: Patients with end-stage kidney disease (ESKD) have different options to replace the function of their failing kidneys. The “integrated care” model considers treatment pathways rather than individual renal replacement therapy (RRT) techniques. In such a paradigm, the optimal strategy to plan and enact transitions between the different modalities is very relevant, but so far, only limited data on transitions have been published. Perspectives of patients, caregivers, and health professionals on the process of transitioning are even less well documented. Available literature suggests that poor coordination causes significant morbidity and mortality. This review briefly provides the background, development, and scope of the INTErnational Group Research Assessing Transition Effects in Dialysis (INTEGRATED) initiative. We summarize the literature on the transition between different RRT modalities. Further, we present an international research plan to quantify the epidemiology and to assess the qualitative aspects of transition between different modalities.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 31-01-2019
DOI: 10.2215/CJN.08580718
Abstract: Higher fruit and vegetable intake is associated with lower cardiovascular and all-cause mortality in the general population. It is unclear whether this association occurs in patients on hemodialysis, in whom high fruit and vegetable intake is generally discouraged because of a potential risk of hyperkalemia. We aimed to evaluate the association between fruit and vegetable intake and mortality in hemodialysis. Fruit and vegetable intake was ascertained by the Global Allergy and Asthma European Network food frequency questionnaire within the Dietary Intake, Death and Hospitalization in Adults with ESKD Treated with Hemodialysis study, a multinational cohort study of 9757 adults on hemodialysis, of whom 8078 (83%) had analyzable dietary data. Adjusted Cox regression analyses clustered by country were conducted to evaluate the association between tertiles of fruit and vegetable intake with all-cause, cardiovascular, and noncardiovascular mortality. Estimates were calculated as hazard ratios with 95% confidence intervals (95% CIs). During a median follow-up of 2.7 years (18,586 person-years), there were 2082 deaths (954 cardiovascular). The median (interquartile range) number of servings of fruit and vegetables was 8 (4–14) per week; only 4% of the study population consumed at least four servings per day as recommended in the general population. Compared with the lowest tertile of servings per week (0–5.5, median 2), the adjusted hazard ratios for the middle (5.6–10, median 8) and highest (>10, median 17) tertiles were 0.90 (95% CI, 0.81 to 1.00) and 0.80 (95% CI, 0.71 to 0.91) for all-cause mortality, 0.88 (95% CI, 0.76 to 1.02) and 0.77 (95% CI, 0.66 to 0.91) for noncardiovascular mortality, and 0.95 (95% CI, 0.81 to 1.11) and 0.84 (95% CI, 0.70 to 1.00) for cardiovascular mortality, respectively. Fruit and vegetable intake in the hemodialysis population is low, and higher consumption is associated with lower all-cause and noncardiovascular mortality.
Publisher: Elsevier BV
Date: 11-2010
DOI: 10.1016/J.AHJ.2010.08.012
Abstract: Lowering low-density lipoprotein (LDL) cholesterol with statin therapy has been shown to reduce the incidence of atherosclerotic events in many types of patients, but it remains uncertain whether it is of net benefit among people with chronic kidney disease (CKD). Patients with advanced CKD (blood creatinine ≥ 1.7 mg/dL [≥ 150 μmol/L] in men or ≥ 1.5 mg/dL [≥ 130 μmol/L] in women) with no known history of myocardial infarction or coronary revascularization were randomized in a ratio of 4:4:1 to ezetimibe 10 mg plus simvastatin 20 mg daily versus matching placebo versus simvastatin 20 mg daily (with the latter arm rerandomized at 1 year to ezetimibe 10 mg plus simvastatin 20 mg daily vs placebo). The key outcome will be major atherosclerotic events, defined as the combination of myocardial infarction, coronary death, ischemic stroke, or any revascularization procedure. A total of 9,438 CKD patients were randomized, of whom 3,056 were on dialysis. Mean age was 61 years, two thirds were male, one fifth had diabetes mellitus, and one sixth had vascular disease. Compared with either placebo or simvastatin alone, allocation to ezetimibe plus simvastatin was not associated with any excess of myopathy, hepatic toxicity, or biliary complications during the first year of follow-up. Compared with placebo, allocation to ezetimibe 10 mg plus simvastatin 20 mg daily yielded average LDL cholesterol differences of 43 mg/dL (1.10 mmol/L) at 1 year and 33 mg/dL (0.85 mmol/L) at 2.5 years. Follow-up is scheduled to continue until August 2010, when all patients will have been followed for at least 4 years. SHARP should provide evidence about the efficacy and safety of lowering LDL cholesterol with the combination of ezetimibe and simvastatin among a wide range of patients with CKD.
Publisher: Oxford University Press (OUP)
Date: 17-11-2015
DOI: 10.1093/NDT/GFW031A
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 12-2013
Publisher: Elsevier BV
Date: 2005
DOI: 10.1053/J.ACKD.2004.10.014
Abstract: The prevalence of obesity, defined as a body mass index (BMI) greater than 30 kg/m2, has more than doubled in many Western countries over the past 2 decades and has become a major public health challenge. This epidemic of obesity in developed countries has been matched closely by alarming increases in the incidence of diabetes mellitus, hypertension, chronic kidney disease (CKD), and cardiovascular disease. However, the exact role that increased body size plays in the development of nephropathy and its subsequent contribution to cardiovascular morbidity and mortality remain unclear. For example, whether obesity per se is a risk factor for CKD independent of diabetes mellitus and hypertension is uncertain. Moreover, in patients with end-stage kidney disease, strong evidence suggests that obesity may paradoxically enhance patient survival. This review will focus on the evidence for obesity as an independent risk factor for the development and progression of CKD and as a paradoxical survival factor in patients with end-stage kidney failure. Possible mechanisms underlying these observed associations will be discussed.
Publisher: Frontiers Media SA
Date: 05-2010
DOI: 10.1111/J.1432-2277.2009.01002.X
Abstract: We report the outcomes of renal transplant patients (n = 43) who received grafts from donors (n = 41) with small (<3 cm) renal tumours removed before transplantation covering the period from May 1996 to September 2007. Patient and graft survival were compared with the outcomes of conventional live unrelated transplants (LURTs) (n = 120) and to patient survival on the transplant waiting list for those who did not receive a kidney during this period (n = 153). Patient survival rates at 1, 3 and 5 years were 92%, 88% and 88% for recipients of tumourectomized kidneys (TKs), 99%, 97% and 97% for LURTs, and 98%, 92% and 74% for dialysis patients waiting for a deceased donor kidney (log rank score 10.4, P = 0.005). One patient experienced a local tumour recurrence at 9 years following transplantation. This patient declined intervention and is currently under active surveillance. Transplantation of tumourectomized kidneys from patients with small, localized, incidentally detected renal tumours results in similar outcomes to conventional LURTs and confers a significant survival advantage for patients who would otherwise be unable to receive a transplant.
Publisher: Wiley
Date: 28-03-2021
DOI: 10.1111/HDI.12924
Abstract: Hemodialysis (HD) with medium cut‐off (MCO) dialyzers may expand molecular clearance, predominantly of larger middle molecules (molecular weight 25–60 kDa). However, the impact of MCO dialyzers on long‐term clearance of various other components of the uremic milieu is unknown. The tRial Evaluating Mid cut‐Off Value membrane clearance of Albumin and Light chains in HemoDialysis patients (REMOVAL‐HD) provided an opportunity to assess the effect of MCO dialyzers on protein‐bound uremic toxins and novel markers of mineral metabolism. This exploratory sub‐study of REMOVAL‐HD evaluated changes in protein‐bound solutes (total and free indoxyl sulfate [IS] and p ‐cresyl sulfate [PCS]) and mineral metabolism markers (intact fibroblast growth factor‐23 [iFGF23], fetuin‐A and endogenous calciprotein particles [CPP‐1 and CPP‐2]). Mid‐week, pre‐HD serum samples were collected at baseline and after 12 and 24 weeks of MCO use in stable adult patients. Change from baseline to Week 12 and 24 was estimated using linear mixed effects models. Eighty‐nine participants were studied (mean age 67 ± 15 years, 38% female, 51% diabetic, median urine output 200 ml/24 h). Serum iFGF23 was reduced at Week 12 compared to baseline (−26.8% [95%CI −39.7, −11.1], p = 0.001), which was sustained at Week 24 (−21.7% [95%CI ‐35.7, −4.5], p = 0.012). There was no significant change in serum IS, PCS, fetuin‐A, CPP‐1, or CPP‐2. The use of a MCO dialyzer over 24 weeks was associated with a sustained reduction in FGF23, while other measured components of the uremic milieu were not significantly altered. Further studies are required to determine whether FGF23 reduction is associated with improved patient outcomes.
Publisher: Elsevier BV
Date: 11-2020
DOI: 10.1016/J.KINT.2020.07.023
Abstract: There is a huge gap between the number of patients worldwide requiring versus those actually receiving safe, sustainable, and equitable care for kidney failure. To address this, the International Society of Nephrology coordinated the development of a Strategic Plan for Integrated Care of Patients with Kidney Failure. Implementation of the plan will require engagement of the whole kidney community over the next 5-10 years.
Publisher: Wiley
Date: 10-2001
DOI: 10.1359/JBMR.2001.16.10.1863
Abstract: Osteoporosis is known to occur in patients with kidney transplants, but limited information is available about the prevalence and causes of this complication. We asked all 330 patients with kidney transplants in our unit to participate in this study, of whom 165 (50%) agreed to do so. The characteristics of the participating patients were similar to the remaining 165 nonparticipants. Seventy of 165 (42%) of the participants were women, of whom 40 were postmenopausal, in contrast to the men, of whom only one was hypogonadal. Bone mineral density (BMD) was significantly reduced at the radius (Z score, -1.5) and femoral neck (Z score, -0.7), but the lumbar spine was normal. BMD was lower in women than men at all skeletal sites. Osteoporosis was found in 10-44% and osteopenia in 35-50% of women, depending on the site. BMD was related inversely to time since transplantation and cumulative prednisolone dose. Twenty-seven of the 165 (16%) patients had either vertebral deformities or a history of a low trauma fracture after transplantation. This fracture group consisted of 10/27 (37%) men and 17/27 (63%) women, of whom 14 were postmenopausal. Fracture patients tended to be older and to have a longer duration of renal failure, dialysis and transplantation, a greater cumulative steroid dose, and higher bone resorption markers than the nonfracture group. No differences were found for cumulative doses of cyclosporin or tacrolimus. Logistic regression showed that only duration of dialysis and time since transplantation significantly increased fracture risk, with odds ratios (ORs) for each year of dialysis or transplantation of 1.21 (CI, 1.00-1.48) and 1.14 (CI, 1.05-1.23), respectively. These data show that low bone density and fractures are common in patients with kidney transplants and are determined by both pre- and posttransplant variables. 
Fracture risk was greatest in women, particularly if they were postmenopausal and we recommend that this subgroup is targeted for assessment and treatment.
Publisher: Oxford University Press (OUP)
Date: 03-03-2011
DOI: 10.1093/NDT/GFQ792
Abstract: A recent clinical trial showed harmful renal effects with the combined use of angiotensin-converting enzyme inhibitors (ACEI) and angiotensin-II receptor blockers (ARB) in people with diabetes or vascular disease. We examined the benefits and risks of these agents in people with albuminuria and one or more cardiovascular risk factors. MEDLINE, EMBASE and the Renal Health Library were searched for trials comparing ACEI, ARB or their combination with placebo or with one another in people with albuminuria and one or more cardiovascular risk factors. Eighty-five trials (21,708 patients) were included. There was no significant reduction in the risk of all-cause mortality or fatal cardiac-cerebrovascular outcomes with ACEI versus placebo, ARB versus placebo, ACEI versus ARB or with combined therapy with ACEI + ARB versus monotherapy. There was a significant reduction in the risk of nonfatal cardiovascular events with ACEI versus placebo but not with ARB versus placebo, ACEI versus ARB or with combined therapy with ACEI + ARB versus monotherapy. Development of end-stage kidney disease and progression of microalbuminuria to macroalbuminuria were reduced significantly with ACEI versus placebo and ARB versus placebo but not with combined therapy with ACEI + ARB versus monotherapy. ACEI and ARB exert independent renal and nonfatal cardiovascular benefits while their effects on mortality and fatal cardiovascular disease are uncertain. There is a lack of evidence to support the use of combination therapy. A comparative clinical trial with ACEI, ARB and their combination in people with albuminuria and a cardiovascular risk factor is warranted.
Publisher: Elsevier BV
Date: 03-2013
DOI: 10.1038/KI.2012.375
Abstract: There are few reports regarding outcomes of anti-glomerular basement membrane (GBM) disease in patients who underwent renal replacement therapy. To help define this, we studied all patients with anti-GBM disease who started renal replacement therapy for end-stage renal disease (ESRD) in Australia and New Zealand (ANZDATA Registry) between 1963 and 2010, encompassing 449 individuals (0.8% of all ESRD patients). The median survival on dialysis was 5.93 years, with death predicted by older age and a history of pulmonary hemorrhage. Thirteen patients recovered renal function, although 10 subsequently experienced renal death after a median period of 1.05 years. Of the 224 patients who received their first renal allograft, the 10-year patient and renal allograft survival rates were 86% and 63%, respectively. Six patients experienced anti-GBM disease recurrence in their allograft, which led to graft failure in two. Using multivariable Cox regression analysis, patients with anti-GBM disease had comparable survival on dialysis or following renal transplantation (hazard ratios of 0.86 and 1.03, respectively) compared to those with ESRD due to other causes. Also, renal allograft survival (hazard ratio of 1.03) was not altered compared to other diseases requiring a renal transplant. Thus, anti-GBM disease was an uncommon cause of ESRD, and not associated with altered risks of dialysis, transplant or first renal allograft survival. Death on dialysis was predicted by older age and a history of pulmonary hemorrhage.
Publisher: Wiley
Date: 12-2004
DOI: 10.1111/J.1440-1797.2004.00325.X
Abstract: Peritoneal transport of small solutes generally increases during the first month of peritoneal dialysis (PD). The aim of this study was to prospectively evaluate the ability of the peritoneal equilibration test (PET), carried out 1 and 4 weeks after the commencement of PD, to predict subsequent technique survival. Fifty consecutive patients commencing PD at the Princess Alexandra Hospital between 1 February 2001 and 31 May 2003 participated in the study. Paired 1 week and 1 month PET data were collated and correlated with subsequent technique survival. A significant increase was observed in the dialysate : plasma creatinine ratio at 4 h (D/P Cr) between 1 and 4 weeks after the onset of PD (0.55 +/- 0.12 vs 0.66 +/- 0.11, P < 0.001). Technique survival was significantly longer in patients who experienced a ≥20% rise in D/P Cr during the first month of PD compared with those who did not (2.3 +/- 0.2 vs 1.6 +/- 0.2 years, P < 0.05). Using a multivariate Cox proportional hazards model analysis, the significant independent predictors of death-censored technique survival were an increase in D/P Cr of greater than 20% during the first month (adjusted hazard ratio [HR] 0.20, 95% CI 0.05-0.75), the absence of diabetes mellitus, the absence of ischaemic heart disease, body mass index and baseline peritoneal creatinine clearance. A 20% or greater rise in D/P Cr during the first month of commencing PD is independently predictive of PD technique survival. Further investigations of the mechanisms underlying this phenomenon are warranted.
Publisher: SAGE Publications
Date: 05-2023
DOI: 10.1177/08968608231172740
Abstract: Peritoneal dialysis (PD) catheter-related infections are important risk factors for catheter loss and peritonitis. The 2023 updated recommendations have revised and clarified definitions and classifications of exit site infection and tunnel infection. A new target for the overall exit site infection rate should be no more than 0.40 episodes per year at risk. The recommendation about topical antibiotic cream or ointment to catheter exit site has been downgraded. New recommendations include clarified suggestion of exit site dressing cover and updated antibiotic treatment duration with emphasis on early clinical monitoring to ascertain duration of therapy. In addition to catheter removal and reinsertion, other catheter interventions including external cuff removal or shaving, and exit site relocation are suggested.
Publisher: Elsevier BV
Date: 05-2017
DOI: 10.1053/J.JRN.2016.10.005
Abstract: Emerging evidence suggests that dietary patterns are associated with survival in people with chronic kidney disease (CKD). This study evaluated the relationship between dietary habits and renal-related clinical outcomes in an established CKD cohort. Prospective cohort study. Three outpatient nephrology clinics in Queensland, Australia. A total of 145 adult patients with Stage 3 or 4 CKD (estimated glomerular filtration rate 15-59 mL/minute/1.73 m2). Dietary intake was measured using 24-hour recall and the HeartWise Dietary Habits Questionnaire (DHQ), which evaluates 10 components of dietary patterns in relation to cooking habits and intake of food groups. The primary outcome was a composite end point of all-cause mortality, commencement of dialysis, and doubling of serum creatinine. Secondary outcome was all-cause mortality alone. Multivariate Cox regression analyses calculated hazard ratios (HRs) for associations between DHQ domains and occurrence of the composite outcome, adjusted for confounders including comorbidities and renal function. Over a median follow-up of 36 months, 32% (n = 47) reached the composite end point, and 21% (n = 30) died. Increasing DHQ score was associated with a lower risk of the composite end point with increasing intake of fruits and vegetables (HR 0.61, 95% CI 0.39-0.94) and limiting alcohol consumption (HR 0.79, 95% CI 0.65-0.96). For the secondary outcome of all-cause mortality, there was a significant association with adequate intake of fruits and vegetables (HR 0.35, 95% CI 0.15-0.83). Healthy dietary patterns consisting of adequate fruits and vegetables and limited alcohol consumption are associated with a delay in CKD progression and improved survival in patients with Stage 3 or 4 CKD.
Publisher: Springer Science and Business Media LLC
Date: 28-12-2017
Abstract: As the global burden of chronic kidney disease continues to increase, so does the need for a cost-effective renal replacement therapy. In many countries, patient outcomes with peritoneal dialysis are comparable to or better than those with haemodialysis, and peritoneal dialysis is also more cost-effective. These benefits have not, however, always led to increased utilization of peritoneal dialysis. Use of this therapy is increasing in some countries, including China, the USA and Thailand, but has proportionally decreased in parts of Europe and in Japan. The variable trends in peritoneal dialysis use reflect the multiple challenges in prescribing this therapy to patients. Key strategies for facilitating peritoneal dialysis utilization include implementation of policies and incentives that favour this modality, enabling the appropriate production and supply of peritoneal dialysis fluid at a low cost, and appropriate training for nephrologists to enable increased utilization of the therapy and to ensure that rates of technique failure continue to decline. Further growth in peritoneal dialysis use is required to enable this modality to become an integral part of renal replacement therapy programmes worldwide.
Publisher: Oxford University Press (OUP)
Date: 23-05-2016
DOI: 10.1093/NDT/GFW091
Abstract: With ever-accumulating medical evidence for treatment benefits and harms, it is vital that clinicians are able to access and use up-to-date, best evidence in specific clinical scenarios involving individual patients-the primary goal of evidence-based medicine. In this article, we propose that meta-analysis, when properly conducted and reported in the context of a rigorous systematic review, is an indispensable tool for synthesis and interpretation of clinical evidence for the purpose of informing clinical decision-making by clinicians, patients and health care policy makers. Meta-analysis provides many benefits, including enhanced precision and statistical power, greater transparency, identification of bias, exploration of heterogeneity of effects, enhanced generalizability, efficient integration of clinical knowledge, identification of evidence gaps, better informed future trial design and avoidance of unnecessary research duplication and potential patient harm. The overall standard, clinical value and reach of meta-analysis have been further enhanced by the development of standards for registration, conduct and reporting, as well as advanced meta-analytic techniques, such as network meta-analysis. Of course, meta-analysis can at times be limited by poor quality studies, trial heterogeneity, publication bias and non-rigorous review and analysis, although through careful appraisal these issues can often be identified and explored, such that valuable clinical information can still be obtained. Consequently, meta-analysis is now the most highly cited form of research and is considered by many leading organizations to represent the highest level of clinical evidence. However, to maximize their considerable value, it is essential that all clinicians have the skills to critically appraise, carefully interpret and judiciously apply meta-analyses in their practice.
Publisher: SAGE Publications
Date: 06-02-2020
Publisher: Wiley
Date: 02-2006
DOI: 10.1111/J.1440-1797.2006.00550.X
Abstract: Low-protein diets (≤0.7 g/kg per day) have been advocated for over 70 years as a means of slowing the rate of progression of kidney disease and delaying the appearance of uraemic symptoms and the need for dialysis. However, the available evidence to date suggests that the benefit:risk ratio of dietary protein restriction is not favourable, in that: (i) compliance is generally suboptimal; (ii) most of the published randomised controlled trials demonstrate that low-protein diets do not significantly slow the rate of kidney disease progression; (iii) meta-analyses of controlled trials have demonstrated strong evidence of publication bias favouring studies with positive, rather than negative, results; (iv) the optimal level and duration of dietary protein intake have not been defined; (v) there is no convincing clinical evidence that dietary protein restriction provides any benefit beyond that afforded by angiotensin blockade; and (vi) low-protein diets are associated with both statistically and clinically significant declines in nutritional markers in chronic kidney disease populations, which already have a high prevalence of malnutrition. Patients with progressive kidney disease are therefore likely to be better served by avoiding dietary protein restriction (thereby ensuring optimal preservation of their nutrition) and instituting alternative, proven renoprotective measures (e.g. renin-angiotensin system blockade, blood pressure reduction and statin therapy).
Publisher: American Physiological Society
Date: 15-03-2013
DOI: 10.1152/AJPRENAL.00540.2012
Abstract: Protease-activated receptor-2 (PAR2) is a G protein-coupled receptor abundantly expressed in the kidney. The aim of this study was to profile inflammatory gene and protein expression induced by PAR2 activation in human kidney tubular epithelial cells (HTEC). A novel PAR2 antagonist, GB88, was used to confirm agonist specificity. Intracellular Ca2+ (iCa2+) mobilization, confocal microscopy, gene expression profiling, qRT-PCR, and protein expression were used to characterize PAR2 activation. PAR2 induced a pronounced increase in iCa2+ concentration that was blocked by the PAR2 antagonist. Treatment with SLIGKV-NH2 at the apical or basolateral cell surface for 5 h induced expression of a range of inflammatory genes by greater than fourfold, including IL-1β, TRAF1, IL-6, and MMP-1, as assessed by cDNA microarray and qRT-PCR analysis. Using antibody arrays, GM-CSF, ICAM-1, TNF-α, MMP-1, and MMP-10 were among the induced proteins secreted. Cytokine-specific ELISAs identified three- to sixfold increases in GM-CSF, IL-6, IL-8, and TNF-α, which were blocked by GB88 and protein kinase C inhibitors. Treatment of cells at the basolateral surface induced more potent inflammatory responses, with release of MCP-1 and fibronectin to both the apical and basolateral compartments; apical treatment only increased secretion of these factors to the apical compartment. PAR2 activation at the basolateral surface dramatically reduced transepithelial electrical resistance (TEER), whereas apical treatment had no effect. There was very little leakage of peptides across the cell monolayer (liquid chromatography-mass spectrometry). In summary, SLIGKV-NH2 induced robust proinflammatory responses in HTEC that were antagonized by GB88. These results suggest that PAR2 antagonists could be useful disease-modifying, anti-inflammatory agents in kidney disease.
Publisher: Wiley
Date: 10-03-2021
DOI: 10.1111/NEP.13859
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 08-2012
Publisher: Oxford University Press (OUP)
Date: 29-12-2012
DOI: 10.1093/NDT/GFR635
Abstract: Factors associated with erectile dysfunction in men on haemodialysis are incompletely identified due to suboptimal existing studies. We determined the prevalence and correlates of erectile dysfunction and identified combinations of clinical characteristics associated with a higher risk of erectile dysfunction using recursive partitioning and amalgamation (REPCAM) analysis. We conducted a multinational cross-sectional study in men on haemodialysis within a collaborative network. Erectile dysfunction and depressive symptoms were evaluated using the erectile function domain of the International Index of Erectile Function questionnaire and the Center for Epidemiological Studies-Depression Scale, respectively. Nine hundred and forty-six (59%) of 1611 eligible men provided complete data for erectile dysfunction. Eighty-three per cent reported erectile dysfunction and 47% reported severe erectile dysfunction. Four per cent of those with erectile dysfunction were receiving pharmacological treatment. Depressive symptoms were the strongest correlate of erectile dysfunction [adjusted odds ratio 2.41 (95% confidence interval (CI) 1.57-3.71)]. Erectile dysfunction was also associated with age (1.06, 1.05-1.08), being unemployed (1.80, 1.17-2.79) or receiving a pension (2.05, 1.14-3.69) and interdialytic weight gain (1.9-2.87 kg, 1.92 [CI 1.19-3.09]; >2.87 kg, 1.57 [CI 1.00-2.45]). Married men had a lower risk of erectile dysfunction (0.49, 0.31-0.76). The prevalence of erectile dysfunction was highest (94%) in unmarried and unemployed or retired men with depressive symptoms. Most men on haemodialysis experience erectile dysfunction and are untreated. Given the prevalence of this condition and the relative lack of efficacy data for pharmacological agents, we suggest that large trials of pharmacological and non-pharmacological interventions for erectile dysfunction and depression are needed.
Publisher: Public Library of Science (PLoS)
Date: 16-11-2020
DOI: 10.1371/JOURNAL.PONE.0242254
Abstract: Residual kidney function (RKF) is associated with improved survival and quality of life in dialysis patients. Previous studies have suggested that initiation of peritoneal dialysis (PD) may slow RKF decline compared to the pre-dialysis period. We sought to evaluate the association between PD initiation and RKF decline in the Initiating Dialysis Early And Late (IDEAL) trial. In this post hoc analysis of the IDEAL randomized controlled trial, PD participants were included if results from 24-hour urine collections had been recorded within 30 days of dialysis initiation, and at least one value pre- and one value post-dialysis commencement were available. The primary outcome was slope of RKF decline, calculated as the mean of urinary creatinine and urea clearances. Secondary outcomes included slope of urine volume decline and time from PD initiation to anuria. The study included 151 participants (79 early start, 72 late start). The slope of RKF decline was slower after PD initiation (-2.69 ± 0.18 mL/min/1.73 m²/yr) compared to before PD (-4.09 ± 0.33 mL/min/1.73 m²/yr; change in slope +1.19 mL/min/1.73 m²/yr, 95% CI 0.48-1.90, p < 0.001). In contrast, urine volume decline was faster after PD commencement (-0.74 ± 0.05 L/yr) compared to beforehand (-0.57 ± 0.06 L/yr; change in slope -0.18 L/yr, 95% CI -0.34 to -0.01, p = 0.04). No differences were observed between the early- and late-start groups with respect to RKF decline, urine volume decline or time to anuria. Initiation of PD was associated with a slower decline of RKF compared to the pre-dialysis period.
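The primary outcome above defines RKF as the mean of urinary creatinine and urea clearances from a 24-hour collection. As a rough illustration of that calculation (function names and example values are illustrative assumptions, not trial data), a minimal Python sketch:

```python
# Residual kidney function (RKF) from a 24-hour urine collection,
# computed as the mean of urea and creatinine clearances.
# All names and example values are illustrative, not IDEAL trial data.

def clearance_ml_min(urine_conc, urine_volume_ml, plasma_conc, minutes=1440):
    """Solute clearance (mL/min) = (U x V) / (P x t)."""
    return (urine_conc * urine_volume_ml) / (plasma_conc * minutes)

def rkf_ml_min(u_urea, u_creat, p_urea, p_creat, urine_volume_ml):
    """RKF as the arithmetic mean of urea and creatinine clearances."""
    c_urea = clearance_ml_min(u_urea, urine_volume_ml, p_urea)
    c_creat = clearance_ml_min(u_creat, urine_volume_ml, p_creat)
    return (c_urea + c_creat) / 2.0

# Illustrative example: urine/plasma concentrations in consistent units.
rkf = rkf_ml_min(u_urea=120.0, u_creat=6.0, p_urea=20.0, p_creat=0.5,
                 urine_volume_ml=800.0)
```

In practice the result is usually also normalised to 1.73 m² of body surface area, matching the slope units quoted in the abstract.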
Publisher: Wiley
Date: 08-2012
DOI: 10.5694/MJA11.11468
Abstract: Optimal detection and subsequent risk stratification of people with chronic kidney disease (CKD) requires simultaneous consideration of both kidney function (glomerular filtration rate [GFR]) and kidney damage (as indicated by albuminuria or proteinuria). Measurement of urinary albuminuria and proteinuria is hindered by a lack of standardisation regarding requesting, sample collection, reporting and interpretation of tests. A multidisciplinary working group was convened with the goal of developing and promoting recommendations that achieve consensus on these issues. The working group recommended that the preferred method for assessment of albuminuria in both diabetic and non-diabetic patients is urinary albumin-to-creatinine ratio (UACR) measurement in a first-void spot urine specimen. Where a first-void specimen is not possible or practical, a random spot urine specimen for UACR is acceptable. The working group recommended that adults with one or more risk factors for CKD should be assessed using UACR and estimated GFR every 1-2 years, depending on their risk-factor profile. Recommended testing algorithms and sex-specific cut-points for microalbuminuria and macroalbuminuria are provided. The working group recommended that all pathology laboratories in Australia should implement the relevant recommendations as a vital component of an integrated national approach to detection of CKD.
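The UACR itself is a simple ratio of urinary albumin to urinary creatinine, reported against sex-specific cut-points. A minimal sketch of the calculation and categorisation; the cut-point values and function names below are illustrative assumptions, not the working group's published figures:

```python
# Urinary albumin-to-creatinine ratio (UACR) from a spot urine specimen,
# with sex-specific categorisation. The cut-points (mg/mmol) are
# placeholders for illustration -- consult the position statement for
# the recommended values.

MICRO_CUTOFF = {"male": 2.5, "female": 3.5}    # mg/mmol (assumed)
MACRO_CUTOFF = {"male": 25.0, "female": 35.0}  # mg/mmol (assumed)

def uacr(albumin_mg_per_l, creatinine_mmol_per_l):
    """UACR in mg/mmol: urinary albumin divided by urinary creatinine."""
    return albumin_mg_per_l / creatinine_mmol_per_l

def albuminuria_category(ratio_mg_mmol, sex):
    """Classify a UACR value using the sex-specific cut-points above."""
    if ratio_mg_mmol < MICRO_CUTOFF[sex]:
        return "normoalbuminuria"
    if ratio_mg_mmol <= MACRO_CUTOFF[sex]:
        return "microalbuminuria"
    return "macroalbuminuria"
```

Note how the same ratio can fall into different categories by sex, which is why the working group publishes sex-specific cut-points rather than a single threshold.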
Publisher: Oxford University Press (OUP)
Date: 14-05-2009
DOI: 10.1093/NDT/GFP216
Abstract: Recovery of dialysis-independent renal function in long-term dialysis patients has not been studied extensively. The aim of this study was to investigate the effect of dialysis modality on the likelihood, timing and durability of recovery of dialysis-independent renal function. The study reviewed all patients in Australia and New Zealand who commenced dialysis for treatment of end-stage renal disease (ESRD) between 1963 and 2006. Dialysis modality was assigned at 90 days. A supplementary analysis was also conducted using a contemporary cohort that included data on comorbidities, smoking and eGFR at dialysis onset. During the study period, 15 912 individuals received peritoneal dialysis (PD) and 23 658 received haemodialysis (HD). Renal recovery occurred in 176 (1.1%) PD and 244 (1.0%) HD patients. Using multivariate Cox proportional hazards regression analyses, dialysis modality was not independently predictive of time to renal recovery (HR 0.92, 95% CI 0.76-1.13, P = 0.4). Recovery was significantly more likely in patients with higher baseline eGFR, with no hypertension or peripheral vascular disease, and with certain causes of kidney failure (autoimmune renal disease, haemolytic uraemic syndrome, interstitial nephritis, obstructive uropathy, paraproteinaemia and renovascular nephrosclerosis). Recovery was less likely in Maori/Pacific Islanders and in patients with polycystic kidney disease. Among patients who recovered, 328 (78%) subsequently experienced renal death, mostly within the first year. The duration of renal recovery was not associated with initial dialysis modality (OR 0.82, 95% CI 0.50-1.32). Dialysis modality is not associated with the likelihood, timing or durability of spontaneous recovery of dialysis-independent renal function in patients thought to have ESRD.
Publisher: BMJ
Date: 25-02-2008
Publisher: SAGE Publications
Date: 07-2017
Publisher: Elsevier BV
Date: 10-1998
Publisher: S. Karger AG
Date: 10-02-2005
DOI: 10.1159/000083926
Abstract: Background/Aims: Treatment of renal cell carcinoma (RCC) is limited by its resistance to conventional chemotherapies. This may occur, in part, from resistance to apoptosis. The role of caspase activation in apoptosis resistance in treated RCCs was investigated. Methods: Two human RCC cell lines (ACHN and SN12K1) and renal tubular epithelial cells (HK2) were treated with 5-fluorouracil (0.2–20 µg/ml) or cisplatin (1–100 µM). Activation of caspase-3 and -2 was analysed and compared with levels of apoptosis. Caspase function was analysed using pan-caspase inhibition (z-VAD-fmk) and caspase-2 inhibition (z-VDVAD-fmk). Results: RCC apoptosis was significantly lower (p < 0.05) than in HK2s after treatment, confirming their chemoresistance. Pro-caspase-3 (32 kDa) was detected in all cell lines. Cleaved caspase-3 (19 kDa) was not detected by Western immunoblots in treated RCCs and only minimal activated caspase-3 was detected in treated RCCs using immunohistochemistry. All cells had pro-caspase-2 (48 kDa) and the activated form (33 kDa) appeared in all treated cells. Caspase inhibition caused a reduction in, but not negation of, therapy-induced apoptosis in HK2s and RCCs (p < 0.05 for HK2s and ACHN cells), indicating that a caspase activation pathway must occur in RCC apoptosis but this pathway does not act via caspase-3 cleavage. Inhibition of caspase-2 reduced apoptosis only in HK2s, indicating that the activated caspase-2, identified in treated RCCs, was not responsible for their apoptosis induction. Conclusion: Specific differences in caspase-3 and -2 activation were identified in renal tubular epithelium and RCCs after chemotherapy. Identification of RCC-specific caspase inactivation or redundancy may explain, in part, the resistance of RCCs to cancer therapies and may be useful in targeting apoptotic pathways to overcome RCC resistance to treatment.
Publisher: Oxford University Press (OUP)
Date: 10-12-2020
DOI: 10.1093/NDT/GFZ238
Abstract: There has been little research on strategies for prevention of peritoneal dialysis (PD)-related peritonitis. We explored whether regular retraining on bag exchanges (via two methods: technique inspection and oral education) every other month could help reduce the risk of peritonitis in PD patients through a randomized controlled trial (RCT). This is an RCT conducted at Peking University First Hospital. A total of 150 incident patients receiving PD at our centre were included between December 2010 and June 2016 and followed up until June 2018. Patients were randomly assigned 1:1:1 to receive retraining on bag exchange via technique inspection, oral education or usual care. The primary outcome was time to the first peritonitis episode. Secondary outcomes were time to organism-specific peritonitis, transfer to haemodialysis and all-cause death. Patients in the technique inspection group, oral education group and usual care group (n = 50 for each group) were followed up for 47.5 ± 22.9 months. Time to first peritonitis was comparable between the groups. The technique inspection group showed a lower risk of first non-enteric peritonitis than the usual care group, while the oral education group did not show a significant benefit. The incidence of first non-enteric peritonitis in the usual care group (0.07/patient-year) was significantly higher than that in the technique inspection group (0.02/patient-year; P < 0.01) but was comparable with that in the oral education group (0.06/patient-year). Transfer to haemodialysis and all-cause mortality were not significantly different between the groups. Neither technique inspection nor oral education significantly altered the risk of all-cause peritonitis compared with usual care, despite technique inspection showing a trend towards reducing the risk of non-enteric PD-related peritonitis. ClinicalTrials.gov (NCT01621997).
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 04-2021
Abstract: Commencing hemodialysis (HD) with an arteriovenous access is associated with superior patient outcomes compared with a catheter, but the majority of patients in Australia and New Zealand initiate HD with a central venous catheter. This study examined patient and center factors associated with arteriovenous fistula/graft access use at HD commencement. We included all adult patients starting chronic HD in Australia and New Zealand between 2004 and 2015. Access type at HD initiation was analyzed using logistic regression. Patient-level factors included sex, age, race, body mass index (BMI), smoking status, primary kidney disease, late nephrologist referral, comorbidities, and prior RRT. Center-level factors included size; transplant capability; home HD proportion; incident peritoneal dialysis (average number of patients commencing RRT with peritoneal dialysis per year); mean weekly HD hours; average blood flow; and achievement of phosphate, hemoglobin, and weekly Kt/V targets. The study included 27,123 patients from 61 centers. Arteriovenous access use at HD commencement varied four-fold from 15% to 62% (median 39%) across centers. Incident arteriovenous access use was more likely in patients aged 51–72 years, males, and patients with a BMI of kg/m² and polycystic kidney disease, but use was less likely in patients with a BMI of .5 kg/m², late nephrologist referral, diabetes mellitus, cardiovascular disease, chronic lung disease, and prior RRT. Starting HD with an arteriovenous access was less likely in centers with the highest proportion of home HD, and no center factor was associated with higher arteriovenous access use.
Adjustment for center-level characteristics resulted in a 25% reduction in observed intercenter variability of arteriovenous access use at HD initiation compared with the model adjusted for only patient-level characteristics. This study identified several patient and center factors associated with incident HD access use, yet these factors did not fully explain the substantial variability in arteriovenous access use across centers.
Publisher: Informa UK Limited
Date: 12-2009
DOI: 10.1586/ERI.09.100
Abstract: Access-related infections (ARIs), such as exit-site infections, tunnel infections, bacteremia, fungemia and peritonitis, are the Achilles' heel of dialysis, and contribute significantly to morbidity, mortality and excess healthcare costs in hemodialysis and peritoneal dialysis patient populations. Despite international guidelines recommending the avoidance of catheters for hemodialysis access, hospital admissions for vascular ARIs have doubled in the last decade. Moreover, repeated use of antibiotics to treat ARIs has been associated with the selection of multiresistant organisms, such as methicillin-resistant Staphylococcus aureus and vancomycin-resistant enterococci. ARIs result from direct inoculation of skin organisms during access cannulation/connection, migration of skin organisms along dialysis catheters into the bloodstream or peritoneal cavity, or contamination and colonization of catheter lumens with subsequent biofilm formation. This paper will review the epidemiology, pathogenesis and prevention of ARIs. It will focus specifically on randomized, controlled trial evidence in relation to the safety and efficacy of aseptic techniques, nasal eradication of S. aureus, oral antimicrobial prophylaxis, topical antimicrobial prophylaxis (including disinfectants, antibiotics and antibacterial honey), antimicrobial catheter lock solutions (including gentamicin, citrate and ethanol), antimicrobial-impregnated catheters, catheter design (straight vs coiled, single vs double cuff), peritoneal dialysis catheter connectology, catheter insertion technique, germicidal devices, vaccines and preinsertion antibiotic prophylaxis.
Publisher: Springer Science and Business Media LLC
Date: 10-01-2014
Publisher: Oxford University Press (OUP)
Date: 11-2004
DOI: 10.1086/425125
Abstract: Although extrapulmonary manifestations of Mycoplasma pneumoniae infection are generally immune-mediated, disseminated infections occasionally occur. We describe a renal transplant recipient who developed disseminated pyogenic M. pneumoniae infection that was detected by broad-range polymerase chain reaction and that manifested by prosthetic arterial graft infection, psoas abscess, and septic arthritis.
Publisher: Springer Science and Business Media LLC
Date: 13-03-2018
Publisher: Elsevier BV
Date: 03-2005
DOI: 10.1111/J.1523-1755.2005.00157.X
Abstract: A recent in vitro model of oxidative stress-induced renal fibrosis demonstrated that activated or phosphorylated extracellular signal-regulated protein kinase (pERK) played a role in apoptosis of renal fibroblasts, but not tubular epithelium where it promoted cell growth and survival. The present study utilized an in vivo model of renal fibrosis after unilateral ureteral obstruction (UUO) to examine the relationship between pERK, apoptosis, proliferation, and differentiation in renal fibroblast and tubular epithelial cells, in comparison with the in vitro results. UUO was induced in rats for 0 (controls, untreated), 6, and 24 hours, 2, 4, and 7 days (N= 4), and tissue analyzed for fibrotic characteristics using microscopy and special stains, Western immunoblots and reverse transcription-polymerase chain reaction (RT-PCR). Controls and UUO animals were also treated with vitamin E, N-acetylcysteine (NAC), or fluvastatin to assess any antioxidant effect on attenuation of fibrosis and pERK expression. Azan stain and alpha-smooth muscle actin (alpha-SMA), collagen III, and fibronectin expression confirmed development of UUO-induced fibrosis. Oxidative stress markers heme oxygenase-1 (HO-1) and 8-hydroxy-2'-deoxyguanosine (8-OHdG) confirmed oxidative stress at all UUO time points. Tubular epithelial and interstitial mitosis and apoptosis were significantly increased over controls at 2 to 7 days after UUO (P < 0.01). The pERK/ERK ratio increased significantly at 1 to 7 days of UUO in comparison with controls (three- to fivefold, P < 0.05). There was a significant spatiotemporal correlation between pERK and tubular epithelial proliferation (P < 0.001). pERK occasionally colocalized with apoptotic cells (dual labeling) in the interstitium but not in the tubular epithelium. 
Fluvastatin was the only treatment that attenuated fibrosis (decreased alpha-SMA, fibronectin, tubular epithelial apoptosis) and it also significantly decreased expression of 8-OHdG at 2 and 7 days (P < 0.05). It was associated with decreased pERK at 7 days, compared with UUO alone (P < 0.05). Promotion of tubular epithelial proliferation and survival, and interstitial cell apoptosis, may minimize renal fibrosis after UUO. In the present study, both were linked spatially and temporally with increased pERK expression. Fluvastatin treatment attenuated UUO-induced fibrosis via an antioxidant and pERK-related mechanism.
Publisher: Oxford University Press (OUP)
Date: 13-05-2017
DOI: 10.1093/NDT/GFX054
Publisher: Wiley
Date: 12-2004
DOI: 10.1111/J.1440-1797.2004.00338.X
Abstract: Poor phosphate control is common among patients with end-stage renal disease. Sevelamer hydrochloride has been demonstrated to be a safe and effective phosphate binder when used as a monotherapy. However, cost limits its usefulness in many countries. Data assessing its effectiveness and safety in combination with conventional phosphate binders are lacking. Dialysis patients meeting the following inclusion criteria participated in this study: (i) hyperphosphataemia >1.8 mmol/L (5.6 mg/dL); and (ii) an inability to tolerate currently available binders. The trial was conducted in three phases, each lasting 3 months: (i) an observation phase (patients continued on their regular phosphate binders); (ii) a titration phase (sevelamer was added at a dose of 403 mg three times daily with meals, titrated to a maximum of 1209 mg three times daily); and (iii) a maintenance phase. Twenty-five patients were recruited into the study. Eighteen patients completed all three trial phases. Mean serum phosphate dropped from 2.11 +/- 0.06 mmol/L (6.6 +/- 0.2 mg/dL) during the observation period to 1.91 +/- 0.01 mmol/L (5.9 +/- 0.003 mg/dL) during the maintenance phase (P=0.02). Calcium × phosphate product fell from 5.49 +/- 0.17 mmol²/L² (68.64 +/- 2.11 mg²/dL²) to 4.89 +/- 0.27 mmol²/L² (61.36 +/- 3.35 mg²/dL²) (P=0.02). There was no significant change in serum calcium or parathyroid hormone. Total serum cholesterol fell from 3.8 mmol/L (3.4-4.37) [147 mg/dL (131-169)] to 3.55 mmol/L (2.97-4.2) [137 mg/dL (115-162)] (P=0.02). Serum low-density lipoprotein cholesterol also fell significantly from 1.67 +/- 0.10 mmol/L (65 +/- 4 mg/dL) to 1.52 +/- 0.11 mmol/L (59 +/- 4 mg/dL) (P=0.04). The average prescribed dose of sevelamer was 2.4 g/day. Elemental calcium dropped from 3.4 g/day (1.4-4.6) to 1.2 g/day (0.6-2.4) (P=0.04). Seventy-two per cent of patients reported mild flatulence, nausea and indigestion. Three patients discontinued treatment because of adverse effects. 
Sevelamer in combination with conventional phosphate binders is effective in lowering serum phosphate and calcium-phosphate product in patients with refractory hyperphosphataemia. Beneficial effects on lipid profile were also observed. Mild gastrointestinal upset is common.
Publisher: Elsevier BV
Date: 2014
DOI: 10.1038/KI.2013.391
Publisher: Elsevier BV
Date: 07-2013
DOI: 10.1038/KI.2013.77
Abstract: Prevalence estimates of depression in chronic kidney disease (CKD) vary widely in existing studies. We conducted a systematic review and meta-analysis of observational studies to summarize the point prevalence of depressive symptoms in adults with CKD. We searched MEDLINE and Embase (through January 2012). Random-effects meta-analysis was used to estimate the prevalence of depressive symptoms. We also limited the analyses to studies using clinical interview and prespecified criteria for diagnosis. We included 249 populations (55,982 participants). Estimated prevalence of depression varied by stage of CKD and the tools used for diagnosis. Prevalence of interview-based depression in CKD stage 5D was 22.8% (confidence interval (CI), 18.6-27.6), but estimates were somewhat less precise for CKD stages 1-5 (21.4% (CI, 11.1-37.2)) and for kidney transplant recipients (25.7% (12.8-44.9)). Using self- or clinician-administered rating scales, the prevalence of depressive symptoms for CKD stage 5D was higher (39.3% (CI, 36.8-42.0)) relative to CKD stages 1-5 (26.5% (CI, 18.5-36.5)) and transplant recipients (26.6% (CI, 20.9-33.1)) and suggested that self-report scales may overestimate the presence of depression, particularly in the dialysis setting. Thus, interview-defined depression affects approximately one-quarter of adults with CKD. Given the potential prevalence of depression in the setting of CKD, randomized trials to evaluate effects of interventions for depression on patient-centered outcomes are needed.
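The pooling step described above (random-effects meta-analysis of prevalence estimates) can be sketched with a DerSimonian-Laird estimator applied to logit-transformed proportions. The study counts below are invented for illustration, and a real analysis would add confidence intervals and handle zero-event studies:

```python
# Minimal random-effects (DerSimonian-Laird) pooling of prevalence
# proportions on the logit scale. Example counts are invented;
# this is a sketch, not the method code used in the cited review.
import math

def pool_prevalence(events, totals):
    """Return pooled prevalence from per-study event counts and sizes."""
    # Per-study logit proportions and approximate variances.
    ys, vs = [], []
    for e, n in zip(events, totals):
        p = e / n
        ys.append(math.log(p / (1 - p)))
        vs.append(1 / e + 1 / (n - e))        # var of logit(p)
    w = [1 / v for v in vs]                   # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
    # DerSimonian-Laird between-study variance tau^2.
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, ys))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(ys) - 1)) / c)
    # Random-effects weights, pooled logit, back-transform to a proportion.
    wr = [1 / (v + tau2) for v in vs]
    pooled = sum(wi * yi for wi, yi in zip(wr, ys)) / sum(wr)
    return 1 / (1 + math.exp(-pooled))

# e.g. three hypothetical studies of depressive symptoms in CKD:
estimate = pool_prevalence([30, 55, 12], [120, 200, 80])
```

The logit transform keeps the pooled value inside (0, 1) and stabilises the variance for proportions near the boundaries, which matters when study prevalences range as widely as they do in this literature.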
Publisher: Springer Science and Business Media LLC
Date: 08-2011
DOI: 10.2165/11593890-000000000-00000
Abstract: Tacrolimus is a cornerstone immunosuppressant agent in the prevention of organ rejection following transplantation. While typically administered twice daily (Prograf®), a modified-release once-daily formulation (Advagraf®) has recently been developed and licensed for use. To date, the majority of published data relating to the use of Advagraf® have arisen from industry-sponsored clinical trials. These have shown that conversion from Prograf® to Advagraf® on a 1 mg : 1 mg basis in both stable and de novo kidney and liver transplant recipients yields lower peak concentrations (C(max)) but equivalent overall drug exposure (area under the concentration-time curve from 0 to 24 hours post-dose [AUC(24)]) and trough concentrations (C(min)). This has led to the proposal that the same total daily dose, target C(min) and therapeutic drug monitoring (TDM) strategies can be applied irrespective of preparation. However, while Advagraf® fulfils criteria for bioequivalence according to the European Medicines Agency and US FDA, lower tacrolimus exposure has been observed in the majority of clinical studies, particularly in the early post-transplant period. This has resulted in a need for higher doses of Advagraf® compared with Prograf® to achieve similar C(min) values. Significant between-subject variability in the C(min)/AUC(24) relationship with Advagraf® has also been demonstrated, suggesting possible problems with TDM based on C(min) values. In non-comparative conversion studies, Advagraf® demonstrated similar efficacy and safety to Prograf®. However, phase III studies in de novo kidney and liver transplant recipients have shown higher rates of acute rejection with Advagraf®, possibly explained by the differing C(max) values achieved with the two preparations. While it has been suggested that once-daily administration may improve compliance, no studies have proven this to be the case. 
This article reviews the pharmacokinetics, efficacy, adverse effects and utility of Advagraf® in relation to its equivalence to Prograf®, and areas that require additional research are identified.
Publisher: Wiley
Date: 02-2009
DOI: 10.1111/J.1440-1797.2008.01017.X
Abstract: Renal fibrosis is central to progression of most chronic renal pathologies. Antioxidants that protect the tubular epithelium and anti-fibrotics that induce apoptosis of pro-fibrotic myofibroblasts without adversely affecting tubular epithelium may slow progression of renal fibrosis, while toxic substances may exacerbate renal scarring. We investigated 47 herbs for their in vitro toxic or antioxidant effects on normal renal mammalian fibroblasts (NRK49F) and tubular epithelial cells (NRK52E) to determine their potential value as therapeutic agents in renal fibrosis involving oxidative stress. Herbs were chosen because of their traditional use in kidney or urinary system disorders, or because of recent published interest in their therapeutic or toxic potential in kidney disease. Extracts of herbs were made using a sequential multi-solvent extraction process. Each extract was analysed separately. Extraction solvents were ethyl acetate, methanol and 50% aqueous methanol. Cells were treated with extracts with/without oxidative stress (1.0 mM hydrogen peroxide). Cellular changes (apoptosis, necrosis, mitosis, transdifferentiation) were identified and quantified using defined criteria. All extracts of Dioscorea villosa showed significant toxicity to both cell lines. At low concentrations (5-50 microg/mL) they induced epithelial to mesenchymal transdifferentiation, as demonstrated by increased immunohistochemistry staining for alpha-smooth muscle actin and transforming growth factor-beta1 in treated versus control cells. Angelica sinensis, Centella asiatica, Glycyrrhiza glabra, Scutellaria lateriflora, and Olea europaea demonstrated strong antioxidant effects in epithelial cells and/or apoptotic effects on fibroblasts. This investigation has revealed renotoxicity of D. villosa and anti-fibrotic, oxidant potential of several herbal extracts, all of which require further study.
Publisher: Wiley
Date: 31-01-2003
DOI: 10.1046/J.1440-1797.2003.00117.X
Abstract: Icodextrin is a starch-derived, high molecular weight glucose polymer, which has been shown to promote sustained ultrafiltration equivalent to that achieved with hypertonic (3.86%/4.25%) glucose exchanges during prolonged intraperitoneal dwells (up to 16 h). Patients with impaired ultrafiltration, particularly in the settings of acute peritonitis, high transporter status and diabetes mellitus, appear to derive the greatest benefit from icodextrin with respect to augmentation of dialytic fluid removal, amelioration of symptomatic fluid retention and possible prolongation of technique survival. Glycaemic control is also improved by substituting icodextrin for hypertonic glucose exchanges in diabetic patients. Preliminary in vitro and ex vivo studies suggest that icodextrin demonstrates greater peritoneal membrane biocompatibility than glucose-based dialysates, but these findings need to be confirmed by long-term clinical studies. This paper reviews the available clinical evidence pertaining to the safety and efficacy of icodextrin and makes recommendations for its use in peritoneal dialysis.
Publisher: S. Karger AG
Date: 28-10-1999
DOI: 10.1159/000020626
Abstract: The clinical utility of cyclosporin A (CyA) as an immunosuppressive agent has been significantly limited by the frequent occurrence of chronic nephrotoxicity, characterised by tubular atrophy, interstitial fibrosis and progressive renal impairment. The pathogenesis of this condition remains poorly understood, but has been postulated to be due to either direct cytotoxicity or indirect injury secondary to chronic renal vasoconstriction. Using primary cultures of human proximal tubule cells (PTCs) and renal cortical fibroblasts (CFs) as an in vitro model of the tubulointerstitium, we have been able to demonstrate that clinically relevant concentrations of CyA are directly toxic to these cells and promote fibrogenesis by a combination of suppressed matrix metalloproteinase activity and augmented fibroblast collagen synthesis. The latter effect occurs secondary to the ability of CyA to stimulate autocrine secretion of insulin-like growth factor-I by CFs and paracrine secretion of transforming growth factor-β by PTCs. Many of these pro-fibrotic mechanisms are completely reversed by concurrent administration of the angiotensin-converting enzyme inhibitor, enalaprilat, which has proven efficacy in preventing chronic CyA nephropathy in vivo. These studies highlight the unique potential that human renal cell cultures offer for studying the role of local cytokine networks in tubulointerstitial disease and for developing more effective treatment strategies which specifically target fibrogenic growth factor activity following nephrotoxic injuries.
Publisher: Elsevier BV
Date: 05-2019
DOI: 10.1053/J.JRN.2018.08.008
Abstract: Gut dysbiosis has been implicated in the pathogenesis of chronic kidney disease (CKD). Restoring gut microbiota with prebiotic, probiotic, and synbiotic supplementation has emerged as a potential therapeutic intervention but has not been systematically evaluated in the CKD population. This is a systematic review. A structured search of MEDLINE, CINAHL, EMBASE, Cochrane Central Register of Controlled Trials, and the International Clinical Trials Register Search Portal was conducted for articles published from inception until July 2017. Included studies were randomized controlled trials investigating the effects of prebiotic, probiotic, and/or synbiotic supplementation (>1 week) on uremic toxins, microbiota profile, and clinical and patient-centered outcomes in adults and children with CKD. Sixteen studies investigating 645 adults met the inclusion criteria: 5 investigated prebiotics, 6 probiotics, and 5 synbiotics. The quality of the studies (Grades of Recommendation, Assessment, Development and Evaluation) ranged from moderate to very low. Prebiotic, probiotic, and synbiotic supplementation may have led to little or no difference in serum urea (9 studies, 345 participants: mean difference [MD] -0.30 mmol/L, 95% confidence interval [CI] -2.20 to 1.61, P = .76). There is limited evidence to support the use of prebiotics, probiotics, and/or synbiotics in CKD management.
Publisher: Springer Science and Business Media LLC
Date: 25-05-2020
DOI: 10.1186/S13063-020-04359-2
Abstract: Delayed graft function, the requirement for dialysis due to poor kidney function post-transplant, is a frequent complication of deceased donor kidney transplantation and is associated with inferior outcomes and higher costs. Intravenous fluids given during and after transplantation may affect the risk of poor kidney function after transplant. The most commonly used fluid, isotonic sodium chloride (0.9% saline), contains a high chloride concentration, which may be associated with acute kidney injury, and could increase the risk of delayed graft function. Whether using a balanced, low-chloride fluid instead of 0.9% saline is safe and improves kidney function after deceased donor kidney transplantation is unknown. BEST-Fluids is an investigator-initiated, pragmatic, registry-based, multi-center, double-blind, randomized controlled trial. The primary objective is to compare the effect of intravenous Plasma-Lyte 148 (Plasmalyte), a balanced, low-chloride solution, with the effect of 0.9% saline on the incidence of delayed graft function in deceased donor kidney transplant recipients. From January 2018 onwards, 800 participants admitted for deceased donor kidney transplantation will be recruited over 3 years in Australia and New Zealand. Participants are randomized 1:1 to either intravenous Plasmalyte or 0.9% saline peri-operatively and until 48 h post-transplant, or until fluid is no longer required, whichever comes first. Follow-up is for 1 year. The primary outcome is the incidence of delayed graft function, defined as dialysis in the first 7 days post-transplant. Secondary outcomes include early kidney transplant function (composite of dialysis duration and rate of improvement in graft function when dialysis is not required), hyperkalemia, mortality, graft survival, graft function, quality of life, healthcare resource use, and cost-effectiveness.
Participants are enrolled, randomized, and followed up using the Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry. If using Plasmalyte instead of 0.9% saline is effective at reducing delayed graft function and improves other clinical outcomes in deceased donor kidney transplantation, this simple, inexpensive change to using a balanced low-chloride intravenous fluid at the time of transplantation could be easily implemented in the vast majority of transplant settings worldwide. Australian New Zealand Clinical Trials Registry: ACTRN12617000358347. Registered on 8 March 2017. ClinicalTrials.gov: NCT03829488. Registered on 4 February 2019.
Publisher: Wiley
Date: 21-07-2006
DOI: 10.1111/J.1440-1797.2006.00585.X
Abstract: Erythropoietin (EPO) has been used widely for the treatment of anaemia associated with chronic kidney disease and cancer chemotherapy for nearly 20 years. More recently, EPO has been found to interact with its receptor (EPO-R) expressed in a large variety of non-haematopoietic tissues to induce a range of cytoprotective cellular responses, including mitogenesis, angiogenesis, inhibition of apoptosis and promotion of vascular repair through mobilization of endothelial progenitor cells from the bone marrow. Administration of EPO or its analogue, darbepoetin, promotes impressive renoprotection in experimental ischaemic and toxic acute renal failure, as evidenced by suppressed tubular epithelial apoptosis, enhanced tubular epithelial proliferation and hastened functional recovery. This effect is still apparent when administration is delayed up to 6 h after the onset of injury and can be dissociated from its haematological effects. Based on these highly encouraging results, at least one large randomized controlled trial of EPO therapy in ischaemic acute renal failure is currently underway. Preliminary experimental and clinical evidence also indicates that EPO may be renoprotective in chronic kidney disease. The purpose of the present article is to review the renoprotective benefits of different protocols of EPO therapy in the settings of acute and chronic kidney failure and the potential mechanisms underpinning these renoprotective actions. Gaining further insight into the pleiotropic actions of EPO will hopefully eventuate in much-needed, novel therapeutic strategies for patients with kidney disease.
Publisher: SAGE Publications
Date: 11-2017
Abstract: Worldwide, approximately 11% of patients on dialysis receive peritoneal dialysis (PD). Whilst PD may offer more autonomy to patients compared with hemodialysis, patient and caregiver burnout, technique failure, and peritonitis remain major challenges to the success of PD. Improvements in care and outcomes are likely to be mediated by randomized trials of innovative therapies, but will be limited if the outcomes measured and reported are not important for patients and clinicians. The aim of the Standardised Outcomes in Nephrology-Peritoneal Dialysis (SONG-PD) study is to establish a set of core outcomes for trials in patients on PD based on the shared priorities of all stakeholders, so that outcomes of most relevance for decision-making can be evaluated, and that interventions can be compared reliably. The 5 phases in the SONG-PD project are: a systematic review to identify outcomes and outcome measures that have been reported in randomized trials involving patients on PD; focus groups using nominal group technique with patients and caregivers to identify, rank, and describe reasons for their choice of outcomes; semi-structured key informant interviews with health professionals; a 3-round international Delphi survey involving a multi-stakeholder panel; and a consensus workshop to review and endorse the proposed set of core outcome domains for PD trials. The establishment of 3 to 5 high-priority core outcomes, to be measured and reported consistently in all trials in PD, will enable patients and clinicians to make informed decisions about the relative effectiveness of interventions, based upon outcomes of common importance.
Publisher: Oxford University Press (OUP)
Date: 02-07-2009
DOI: 10.1093/NDT/GFP322
Abstract: Infection due to Corynebacterium species has been reported with increasing frequency over recent decades. The impacts of enhanced laboratory detection together with widespread use of new peritoneal dialysis (PD) connection technology and antimicrobial prophylaxis strategies on Corynebacterium PD-associated peritonitis have not been well studied. We investigated the frequency, predictors, treatment and clinical outcomes of Corynebacterium peritonitis in all Australian adult patients involving 66 centres who were receiving PD between 1 October 2003 and 31 December 2006. Eighty-two episodes of Corynebacterium peritonitis (2.3% of all peritonitis episodes) occurred in 65 (1.4%) PD patients. Ten (15%) patients experienced more than one episode of Corynebacterium peritonitis and additional organisms were isolated in 12 (15%) episodes of Corynebacterium peritonitis. The incidence of Corynebacterium peritonitis was significantly and independently predicted only by BMI: RR 2.72 (95% CI 1.38-5.36) for the highest tertile BMI compared with the lowest tertile. The overall cure rate with antibiotics alone was 67%, which was similar to that of peritonitis due to other organisms. Vancomycin was the most common antimicrobial agent administered in the initial empiric and subsequent antibiotic regimens, although outcomes were similar regardless of antimicrobial schedule. Corynebacterium peritonitis not infrequently resulted in relapse (18%), repeat peritonitis (15%), hospitalization (70%), catheter removal (21%), permanent haemodialysis transfer (15%) and death (2%). The individuals who had their catheters removed more than 1 week after the onset of Corynebacterium peritonitis had a significantly higher risk of permanent haemodialysis transfer than those who had their catheters removed within 1 week (90% versus 43%, P < 0.05). Corynebacterium is an uncommon but significant cause of PD-associated peritonitis.
Complete cure with antibiotics alone is possible in the majority of patients, and rates of adverse outcomes are comparable to those seen with peritonitis due to other organisms. Use of vancomycin rather than cephazolin as empiric therapy does not impact outcomes, and a 2-week course of antibiotic therapy appears sufficient. If catheter removal is required, outcomes are improved by removing the catheter within 1 week of peritonitis onset.
Publisher: Wiley
Date: 2011
Publisher: Oxford University Press (OUP)
Date: 16-07-2008
DOI: 10.1093/NDT/GFN385
Publisher: SAGE Publications
Date: 2020
Abstract: Recognition of the discrepancy between the research priorities of patients and health professionals has prompted efforts to involve patients as active contributors in research activities, including scientific conferences. However, there is limited evidence about the experience, challenges, and impacts of patient involvement to inform best practice. This study aims to describe patient and health professional perspectives on patient involvement at the Congress of the International Society for Peritoneal Dialysis (ISPD). Semi-structured interviews were conducted with 14 patients/caregivers and 15 health professionals from six countries who attended ISPD. Interviews were recorded and transcribed verbatim, and transcripts were analyzed thematically. We identified four themes: protecting and enhancing scientific learning (grounding science in stories, sharing and inspiring new perspectives, distilling the key messages of research presentations, striking a balance between accommodating patients and presenting the science); democratizing access to research (redistributing power, challenging the traditional ownership of knowledge, cultivating self-management through demystifying research); inadequate support for patient/caregiver delegates (lacking purposeful inclusion, challenges in interpreting research findings, soliciting medical advice, difficulty negotiating venue and program, limited financial assistance in attending); and amplifying impact beyond the room (sparking innovation in practice, giving patients and families hope for the future). Patient involvement at the ISPD Congress clarified the applicability of research to patient care and self-management, democratized science, and strengthened the potential impact of research. More structured support for patients to help them purposefully articulate their experience in relation to session objectives may enhance their contribution and their own learning experience.
Publisher: SAGE Publications
Date: 11-2017
Abstract: Corynebacterium is a rare cause of peritonitis that is increasingly being recognized in peritoneal dialysis (PD) patients. The aims of this study were to compare Corynebacterium peritonitis outcomes with those of peritonitis caused by other organisms and to examine the effects of type and duration of antibiotic therapy on outcomes of Corynebacterium peritonitis. Using Australia and New Zealand Dialysis and Transplant Registry (ANZDATA) data, we included all PD patients who developed peritonitis in Australia between 2004 and 2014. The primary outcome was peritonitis cure by antibiotic therapy, defined as resolution of a peritonitis episode with antibiotics alone and without being complicated by recurrence, relapse, catheter removal, hemodialysis transfer, or death. Peritonitis outcomes were analyzed using multivariable logistic regression. A total of 11,122 episodes of peritonitis in 5,367 patients were included. Of these, 162 episodes (1.5%) were due to Corynebacterium. Compared with Corynebacterium peritonitis, the odds of cure were lower in peritonitis due to Staphylococcus aureus (odds ratio [OR] 0.66, 95% confidence interval [CI] 0.45 – 0.97), Pseudomonas (OR 0.22, 95% CI 0.14 – 0.33), other gram-negative organisms (OR 0.52, 95% CI 0.35 – 0.75), fungi (OR 0.02, 95% CI 0.01 – 0.03), polymicrobial organisms (OR 0.32, 95% CI 0.22 – 0.47), and other organisms (OR 0.66, 95% CI 0.44 – 0.99) but similar for culture-negative and other gram-positive peritonitis. Similar results were observed for hemodialysis transfer and death. The outcomes of Corynebacterium peritonitis were not associated with the type of initial antibiotic selected (vancomycin vs cefazolin) or the duration of antibiotic therapy (≤ 14 days vs > 14 days). Outcomes for Corynebacterium peritonitis are generally favorable compared with other forms of peritonitis.
Cure rates did not appear to differ if peritonitis was treated initially with vancomycin or cefazolin or if treatment duration was prolonged beyond 14 days.
Publisher: Wiley
Date: 20-02-2018
DOI: 10.1111/NEP.12992
Abstract: Up to a 10-fold difference in clinical outcomes between Australian peritoneal dialysis (PD) units exists. There is an international focus on the harmonization of educational practices in PD to determine whether this may lead to improved patient outcomes. The aim of this paper is to evaluate the current teaching practices of nurses and patients in Australian PD units. An online survey with questions on nurse and patient training was made available to PD units in Australia. Thirty-eight (70%) of 54 PD units in Australia completed the survey. A written standardized curriculum was utilized in 21 units (55%) for nursing staff and 30 units (79%) for patients, with 23% and 12% including an electronic delivery component for each group, respectively. Universal teaching of adult learning principles was not demonstrated. The hours spent on teaching nursing staff varied widely, exceeding 100 h in 21% of units. The average number of hours spent by nurses each day to train patients also varied, exceeding 6 h in 11% of units, with the average total training time ranging from 2 to 3 days in 14% of units to over 7 days in a further 14%. Staff and patient competency assessments were performed routinely in 37% and 74% of units, respectively. Considerable differences exist amongst Australian PD units in the education of staff and patients. There is a general lack of delivery and competency assessment to meet educational standards. It remains to be seen if harmonization of educational curricula can translate to improved clinical outcomes.
Publisher: Oxford University Press (OUP)
Date: 21-02-2018
DOI: 10.1093/NDT/GFY016
Abstract: Vitamin D deficiency is highly prevalent in patients on dialysis. Although vitamin D deficiency is closely associated with cardiovascular disease (CVD) and high mortality in the general population, the relationship between serum 25-hydroxyvitamin D [25(OH)D] and all-cause and cardiovascular mortality in dialysis patients is uncertain. We aim to explore the relationship between serum 25(OH)D levels and all-cause and cardiovascular mortality in dialysis patients. This is a systematic review and meta-analysis of clinical studies among patients receiving maintenance dialysis. We did a systematic literature search in PubMed and Embase to identify studies reporting the relationship between serum 25(OH)D levels and all-cause and cardiovascular mortality in patients on dialysis. The search was last updated on 10 February 2017. The study included 18 moderate to high-quality cohort studies with an overall sample of 14 154 patients on dialysis. The relative risk of all-cause mortality per 10 ng/mL increase in serum 25(OH)D level was 0.78 [95% confidence interval (CI) 0.71-0.86], although there was marked heterogeneity (I2=96%, P < 0.01) that was partly explained by differences in CVD prevalence, baseline parathyroid hormone level and dialysis duration among included studies. The relative risk of cardiovascular mortality per 10 ng/mL increase in serum 25(OH)D level was 0.71 (95% CI 0.63-0.79), with substantial heterogeneity (I2=74%, P=0.004) that was largely explained by differences in study type and serum 25(OH)D measurement method. In the present study, increased serum 25(OH)D level was significantly associated with lower all-cause mortality and lower cardiovascular mortality in dialysis patients.
Publisher: Elsevier BV
Date: 1996
DOI: 10.1038/KI.1996.36
Abstract: Carbapenem-resistant
Publisher: Elsevier BV
Date: 02-2020
Publisher: Elsevier BV
Date: 04-2016
DOI: 10.1016/J.BBRC.2016.03.048
Abstract: Apoptosis repressor with caspase recruitment domain (ARC), an endogenous inhibitor of apoptosis, is upregulated in a number of human cancers, thereby conferring drug resistance and giving a rationale for the inhibition of ARC to overcome drug resistance. Our hypothesis was that ARC would be similarly upregulated and targetable for therapy in renal cell carcinoma (RCC). Expression of ARC was assessed in 85 human RCC samples and paired non-neoplastic kidney by qPCR and immunohistochemistry, as well as in four RCC cell lines by qPCR, Western immunoblot and confocal microscopy. Contrary to expectations, ARC was significantly decreased in the majority of clear cell RCC and in three (ACHN, Caki-1 and 786-0) of the four RCC cell lines compared with the HK-2 non-cancerous human proximal tubular epithelial cell line. Inhibition of ARC with shRNA in the RCC cell line (SN12K1) that had shown increased ARC expression conferred resistance to Sunitinib, and upregulated interleukin-6 (IL-6) and vascular endothelial growth factor (VEGF). We therefore propose that decreased ARC, particularly in clear cell RCC, confers resistance to targeted therapy through restoration of tyrosine kinase-independent alternate angiogenesis pathways. Although the results are contrary to expectations from other cancer studies, they were confirmed here with multiple analytical methods. We believe the highly heterogeneous nature of cancers like RCC predicate that expression patterns of molecules must be interpreted in relation to respective matched non-neoplastic regions. In the current study, this procedure indicated that ARC is decreased in RCC.
Publisher: SAGE Publications
Date: 09-2013
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 27-11-2012
Publisher: Elsevier BV
Date: 07-2018
DOI: 10.1053/J.AJKD.2017.11.017
Abstract: Arteriovenous access failure frequently occurs in people on hemodialysis and is associated with morbidity, mortality and large healthcare expenditures. Omega-3 polyunsaturated fatty acids (omega-3 PUFA) may improve access outcomes via pleiotropic effects on access maturation and function, but may cause bleeding complications. Systematic review with meta-analysis. Adults requiring hemodialysis via arteriovenous fistula or graft. Trials evaluating omega-3 PUFA for arteriovenous access outcomes identified by searches in CENTRAL, MEDLINE, and Embase to 24 January 2017. Omega-3 PUFA. Primary patency loss, dialysis suitability failure, access abandonment, interventions to maintain patency or assist maturation, bleeding, gastrointestinal side-effects, all-cause and cardiovascular mortality, hospitalization, and treatment adherence. Treatment effects were summarized as relative risks (RR) and 95% confidence intervals (CI). Evidence was assessed using GRADE. Five eligible trials (833 participants) with a median follow-up of 12 months compared peri-operative omega-3 PUFA supplementation with placebo. One trial (n=567) evaluated treatment for fistulae and four (n=266) for grafts. Omega-3 PUFA supplementation prevented primary patency loss with moderate certainty (761 participants, RR 0.81, CI 0.68-0.98). Low-quality evidence suggested that omega-3 PUFA may have had little or no effect on dialysis suitability failure (536 participants, RR 0.95, CI 0.73-1.23), access abandonment (732 participants, RR 0.78, CI 0.59-1.03), need for interventions (732 participants, RR 0.82, CI 0.64-1.04), or all-cause mortality (799 participants, RR 0.99, CI 0.51-1.92). Bleeding risk (793 participants, RR 1.40, CI 0.78-2.49) or gastrointestinal side-effects (816 participants, RR 1.22, CI 0.64-2.34) from treatment were uncertain. There was no evidence of different treatment effects for grafts and fistulae. Small number and methodological limitations of included trials.
Omega-3 PUFA supplementation probably protects against primary loss of arteriovenous access patency, but may have little or no effect on dialysis suitability failure, access interventions or access abandonment. Potential treatment harms are uncertain.
Publisher: Bentham Science Publishers Ltd.
Date: 05-2008
DOI: 10.2174/157488708784223853
Abstract: End stage kidney disease (ESKD) is associated with a 10- to 20-fold increased risk of cardiovascular mortality compared with age- and sex-matched controls without CKD. In spite of this marked increase in risk, the vast majority of cardiovascular intervention clinical trials to date have specifically excluded subjects with CKD. The aim of this paper is to critically review the recently published clinical trial evidence that cardiac outcomes in CKD patients are modified by cardiovascular risk factor interventions, including erythropoiesis stimulating agent therapy (US Normal Hematocrit, CHOIR and CREATE trials), statins (PPP, 4D and ALERT), fibrates (VA-HIT), folic acid (ASFAST, US folic acid trial, HOST), anti-oxidative stress therapy (SPACE, HOPE and ATIC), N-acetylcysteine, sevelamer (D-COR), cinacalcet (Cunningham meta-analysis), carvedilol, angiotensin converting enzyme inhibitor (FOSIDIAL), telmisartan, aspirin (HOT study re-analysis) and multidisciplinary multiple cardiovascular risk factor intervention clinics (LANDMARK). Although none of these studies could be considered conclusive, the negative trials to date should raise significant concerns about the heavy reliance of current clinical practice guidelines on extrapolation of findings from cardiovascular intervention trials in the general population. It may be that cardiovascular disease in dialysis populations is less amenable to intervention, either because of the advanced stage of CKD or because the pathogenesis of cardiovascular disease in CKD patients is different to that in the general population. Further large, well-conducted, multi-centre randomised controlled trials in this area are urgently required.
Publisher: SAGE Publications
Date: 05-2023
Publisher: Elsevier BV
Date: 10-2017
Publisher: Elsevier BV
Date: 10-2017
Publisher: Wiley
Date: 22-04-2016
DOI: 10.1111/NEP.12629
Abstract: Clinical outcomes of patients with end-stage kidney disease (ESKD) receiving renal replacement therapy (RRT) secondary to IgA nephropathy (IgAN) have not been well described. To investigate the characteristics, treatments and outcomes of ESKD because of kidney-limited IgAN and Henoch-Schönlein purpura nephritis (HSPN) in the Australian and New Zealand RRT populations. All ESKD patients who commenced RRT in Australia and New Zealand between 1971 and 2012 were included. Dialysis and transplant outcomes were evaluated in both a contemporary cohort (1998-2012) and the entire cohort (1971-2012). Of 63 297 ESKD patients, 3721 had kidney-limited IgAN, and 131 had HSPN. For the contemporary cohort of IgAN patients on dialysis (n = 2194), 10-year patient survival was 65%. Of 1368 contemporary IgAN patients who received their first renal allograft, 10-year patient, overall renal allograft and death-censored renal allograft survival were 93%, 82% and 88%, respectively. Using multivariable Cox regression analysis, patients with IgAN had favourable dialysis patient survival (adjusted hazard ratio (HR) 0.63, 95% confidence interval (CI) 0.57-0.69), overall renal allograft survival (HR 0.67, 95% CI 0.57-0.79) and renal transplant patient survival (HR 0.58, 95% CI 0.45-0.74) compared with ESKD controls. Similar results were found in the entire cohort and when using competing-risks models. Compared with kidney-limited IgAN patients, those with HSPN had worse dialysis patient survival (HR 1.94, 95% CI 1.02-3.69), overall renal allograft survival (HR 3.40, 95% CI 1.00-11.55) and renal transplant patient survival (HR 3.50, 95% CI 1.03-11.92). IgAN ESKD was associated with better dialysis and renal transplant outcomes compared with other forms of ESKD. The survival outcomes of ESKD patients with HSPN were worse than kidney-limited IgAN.
Publisher: Wiley
Date: 06-04-2016
DOI: 10.1111/JORC.12156
Abstract: Sub-optimal nutrition status is common amongst patients receiving peritoneal dialysis (PD) and leads to poor clinical outcome. This population experiences multi-factorial challenges to achieving optimal nutritional status, particularly driven by inadequate intake. The aim of this investigation was to identify factors associated with inadequate protein intake and sub-optimal nutritional status in patients undergoing PD. This was a cross-sectional study of 67 adult patients receiving PD (mean age 59 ± 14 years; 57% male) within a single centre. Participants were consecutively recruited and interviewed by renal dietitians, collecting: Subjective Global Assessment (SGA); quality of life (using EQ-5D); dietary intake (via dietary interview); and appetite (using Appetite and Diet Assessment Tool). Participant demographics were obtained via survey or medical charts. Main outcome measures were inadequate dietary protein intake (<1.1 g/kg adjusted body weight/day) and malnutrition (as defined by SGA rating B or C). Overall, 15 (22%) patients were malnourished and 29 (43%) had inadequate protein intake. Poor appetite (anorexia) was reported in 62% (18/29) of participants with inadequate protein intake; malnourished patients reported anorexia significantly more often than did the well-nourished patients, of whom 12 (23%) reported anorexia (p = 0.0001). Anorexia was a key risk factor for inadequate protein intake and malnutrition in patients undergoing PD. These findings highlight a need to closely monitor patients with appetite disturbances.
Publisher: Elsevier BV
Date: 07-2009
DOI: 10.1053/J.JRN.2008.11.006
Abstract: We investigated and compared diets and physical activity levels of renal transplant recipients (RTRs) with normal glucose tolerance (NGT) and abnormal glucose tolerance (AGT), and we identified clinical risk factors for AGT. This study was cross-sectional and observational. This study took place in a hospital's renal outpatient department. Patients included adult RTRs with NGT and AGT. All patients were assessed regarding age, body mass index (BMI), waist circumference (WC), waist/hip ratio (WHR), percent body fat (measured using dual-energy x-ray absorptiometry), dietary intake (3-day diet diary), and physical activity (PA) levels (total minutes/week, using the Physical Activity Statewide Questionnaire). The RTRs with AGT (n = 47) were significantly more obese (P = .04) and more centrally obese (P = .05) than RTRs with NGT (n = 35). The mean self-reported dietary macronutrient and energy intake was not significantly different between groups. However, the total amount of PA (median) was significantly lower in RTRs with AGT versus RTRs with NGT (255 [median, range 0 to 1940] versus 580 [median, 75 to 1095] minutes/week, respectively, P = .03), particularly in female RTRs (P = .007). After logistic regression analysis, total PA was identified as an independent predictor of AGT in all RTRs (beta = 0.940, R(2) = 0.090, P = .04). Percent body fat according to dual-energy x-ray absorptiometry was inversely associated with a high level of PA (>300 minutes/week) (beta = 0.906, R(2) = 0.211, P = .003). A higher amount of PA is associated with a lower risk of AGT in RTRs (particularly in females). An emphasis on increasing PA should be encouraged for all RTRs.
Publisher: Elsevier BV
Date: 07-2018
DOI: 10.1053/J.AJKD.2017.11.010
Abstract: Clinical trials are most informative for evidence-based decision making when they consistently measure and report outcomes of relevance to stakeholders. We aimed to assess the scope and consistency of outcomes reported in trials for hemodialysis. Systematic review. Adults requiring maintenance hemodialysis enrolled in clinical trials. All Cochrane systematic reviews of interventions published by August 29, 2016, and the trials published and registered in ClinicalTrials.gov since January 2011. Any hemodialysis-related interventions. Frequency and characteristics of the reported outcome domains and measures. From the 362 trials, we extracted and classified 10,713 outcome measures (a median of 21 [IQR, 10-39] per trial) into 81 different outcome domains, of which 42 (52%) were surrogate, 25 (31%) clinical, and 14 (17%) patient-reported. The number of outcome measures reported significantly changed over time. The 5 most commonly reported domains were all surrogates: phosphate (125 [35%] trials), dialysis adequacy (120 [33%]), anemia (115 [32%]), inflammatory markers (114 [31%]), and calcium (109 [30%]). Mortality, cardiovascular diseases, and quality of life were reported very infrequently (73 [20%], 44 [12%], and 32 [9%], respectively). For feasibility, we included a sampling frame that included only trials identified in Cochrane systematic reviews or ClinicalTrials.gov. Outcomes reported in clinical trials involving adults receiving hemodialysis are focused on surrogate outcomes, rather than clinical and patient-centered outcomes. There is also extreme multiplicity and heterogeneity at every level: domain, measure, metric, and time point. Estimates of the comparative effectiveness of available interventions are unreliable and improvements over time have been inconsistent.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 10-2005
DOI: 10.1097/01.TP.0000173792.53561.B6
Abstract: Insulin resistance (IR) may be implicated in the pathogenesis of atherosclerosis in renal transplant recipients (RTRs) and be contributed to, in part, by free fatty acids (FFAs), produced in excess in centrally obese individuals. The aim of this study was to determine the prevalence of IR and the relationships between FFAs, central obesity, and atherosclerosis in a cohort of prevalent RTRs. Observational data were collected on 85 RTRs (mean age, 54 years; 49% male; 87% Caucasian). Fasting serum was analyzed for FFAs, glucose, and insulin. IR was calculated using the homeostasis model assessment (HOMA-IR) score. Vascular structure was assessed by carotid intima-media thickness (IMT) measurement. Linear regression analyses were performed to determine the factors associated with IR and atherosclerosis. IR occurred in 75% of RTRs, and FFA levels were independently associated with its occurrence (beta: -0.55, 95% CI: -1.02 to -0.07, P = 0.02). Other variables independently associated with IR were male sex, body mass index, central obesity, diabetes, systolic blood pressure, and corticosteroid use. There was a significant correlation between FFA levels and IMT (r = 0.3, P = 0.01). On multivariate analysis, IMT correlated with elevated FFA (beta: 0.07, 95% CI: 0.02-0.12, P = 0.007), diabetes mellitus (P = 0.05), older age, and body mass index > 25 kg/m2 (P = 0.002). FFAs are associated with the development of IR and may be involved in the pathogenesis of atherosclerosis in RTRs. Additional studies are required to explore these associations further before considering whether an interventional trial aimed at lowering FFA would be a worthwhile undertaking.
Publisher: Elsevier BV
Date: 06-1998
Publisher: Elsevier BV
Date: 04-2021
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 03-2014
DOI: 10.1097/01.MNH.0000441046.13912.1F
Abstract: This review will examine the impact of neutral pH, low glucose degradation product (GDP) peritoneal dialysis fluid use on patient-level clinical outcomes in peritoneal dialysis patients. Recently published results from the balANZ trial and a meta-analysis suggest that the use of neutral pH, low GDP peritoneal dialysis fluids leads to better preservation of residual renal function, including residual diuresis, without added harmful effects. The impact of neutral pH, low GDP peritoneal dialysis fluids on other clinical outcomes (e.g. peritonitis) remains uncertain due to conflicting results from randomized controlled trials. A meta-analysis was unable to clarify this further due to generally suboptimal trial quality and insufficient statistical power. At present, based on the best available evidence, the use of neutral pH, low GDP peritoneal dialysis fluids is associated with some important clinical benefits without added harm. Further studies in the area are needed to establish the cost-effectiveness of this therapy and to clarify the effects of biocompatible fluids on patient-level outcomes, such as peritonitis, quality of life, technique survival and patient survival.
Publisher: Oxford University Press (OUP)
Date: 27-12-2010
DOI: 10.1093/NDT/GFP673
Abstract: Renal cell carcinoma (RCC) is a highly metastatic and lethal disease with few efficacious treatments. Many studies have shown that the ubiquitous transcription factor nuclear factor kappa B (NF-kappaB) plays a key role in the development and progression of many cancers including RCC. The aim of this investigation was to evaluate the anti-cancer effect of pyrrolidine dithiocarbamate (PDTC), a NF-kappaB inhibitor, in a murine xenograft model of RCC. The metastatic human RCC cell line, SN12K1, was inoculated into the left kidneys of severe combined immunodeficiency mice and the effect of semi-continuous PDTC treatment (50 mg/kg) on RCC growth analysed 5 weeks later. The analyses carried out in three groups (no treatment, RCC alone and RCC + PDTC) at 5 weeks were: renal weight, protein expression by immunohistochemistry and Western immunoblot, apoptosis (TdT-mediated nick end labelling and morphology) and mitosis (morphology). PDTC significantly decreased RCC growth and the expression of NF-kappaB subunits (p50, p52, c-Rel and RelB), upstream IKK-beta and IKK-gamma, but did not induce any changes in the expression of IkappaB-alpha and IkappaB-beta. RCC growth was associated with a significant decrease in the expression of the anti-apoptotic proteins Bcl-2 and Bcl-(XL) and increase in pro-apoptotic Bax, all of which were reversed by PDTC. Cell proliferation was significantly reduced by PDTC. The results demonstrate the potential anti-cancer benefits of treating NF-kappaB positive RCCs with NF-kappaB inhibitors like PDTC.
Publisher: SAGE Publications
Date: 05-07-2023
DOI: 10.1177/08968608231182885
Abstract: Peritoneal dialysis (PD)-related peritonitis is independently associated with low serum 25-hydroxy vitamin D [25(OH)D] levels. Our objective is to examine the feasibility of conducting a large, randomised controlled trial to determine the effects of vitamin D supplementation on the risk of PD-related peritonitis. Pilot, prospective, open-label randomised controlled trial. Peking University First Hospital, China. Patients receiving PD who had recovered from a recent episode of peritonitis between 30 September 2017 and 28 May 2020. Oral natural vitamin D supplementation (2000 IU per day) versus no vitamin D supplementation for 12 months. Primary outcomes were feasibility (recruitment success, retention, adherence, safety) and fidelity (change in serum 25(OH)D level during follow-up) for a large, randomised controlled trial in the future to determine the effects of vitamin D on PD-related peritonitis. Secondary outcomes were time to peritonitis occurrence and outcome of subsequent peritonitis. Overall, 60 among 151 patients were recruited (recruitment rate was 39.7%, 95% CI 31.9–47.5%; recruitment rate among eligible patients was 61.9%, 95% CI 52.2–71.5%). Retention and adherence rates were 100.0% (95% CI 100.0–100.0%) and 81.5% (95% CI 66.8–96.1%), respectively. During follow-up, serum 25(OH)D levels increased in the vitamin D (VD) group (from 19.25 ± 10.11 nmol/L to 60.27 ± 23.29 nmol/L after 6 months, p < 0.001, n = 31), and remained higher (p < 0.001) than those in the control group (n = 29). No differences were observed between the two groups with respect to time to subsequent peritonitis (hazard ratio 0.85, 95% CI 0.33–2.17) or any of the peritonitis outcomes. Adverse events were uncommon. A randomised controlled trial of the effect of vitamin D supplementation on peritonitis occurrence in patients receiving PD is feasible, safe and results in adequate serum 25(OH)D levels.
Publisher: No publisher found
Date: 2011
DOI: 10.1097/PEC.0B013E31821314CA
Publisher: Oxford University Press (OUP)
Date: 09-2001
Abstract: Concomitant iron supplementation is required in the great majority of erythropoietin (Epo)-treated patients with end-stage renal failure. Intravenous (i.v.) iron supplementation has been demonstrated to be superior to oral iron therapy in Epo-treated haemodialysis patients, but comparative data in iron-replete peritoneal dialysis (PD) patients are lacking. A 12-month, prospective, crossover trial comparing oral and i.v. iron supplementation was conducted in all Princess Alexandra Hospital PD patients who were on a stable dose of Epo, had no identifiable cause of impaired haemopoiesis other than uraemia, and had normal iron stores (transferrin saturation >20% and serum ferritin 100-500 mg/l). Patients received daily oral iron supplements (210 mg elemental iron per day) for 4 months followed by intermittent, outpatient i.v. iron infusions (200 mg every 2 months) for 4 months, followed by a further 4 months of oral iron. Haemoglobin levels and body iron stores were measured monthly. Twenty-eight individuals were entered into the study and 16 patients completed 12 months of follow-up. Using repeated-measures analysis of variance, haemoglobin concentrations increased significantly during the i.v. phase (108+/-3 to 114+/-3 g/l) compared with each of the oral phases (109+/-3 to 108+/-3 g/l and 114+/-3 to 107+/-4 g/l, P<0.05). Similar patterns were seen for both percentage transferrin saturation (23.8+/-2.3 to 30.8+/-3.0%, 24.8+/-2.1 to 23.8+/-2.3%, and 30.8+/-3.0 to 26.8+/-2.1%, respectively, P<0.05) and ferritin (385+/-47 to 544+/-103 mg/l, 317+/-46 to 385+/-47 mg/l, 544+/-103 to 463+/-50 mg/l, respectively, P=0.10). No significant changes in Epo dosages were observed throughout the study. I.v. iron supplementation was associated with a much lower incidence of gastrointestinal disturbances (11 vs 46%, P<0.05), but exceeded the cost of oral iron treatment by 6.5-fold. Two-monthly i.v. 
iron infusions represent a practical alternative to oral iron and can be safely administered to PD patients in an outpatient setting. Compared with daily oral therapy, 2-monthly i.v. iron supplementation in PD patients was better tolerated and resulted in superior haemoglobin levels and body iron stores.
Publisher: Elsevier BV
Date: 2007
DOI: 10.1016/J.BIOCEL.2007.04.025
Abstract: The tubular epithelium of the kidney is susceptible to injury from many causes, such as ischemia-reperfusion and the associated oxidative stress, nephrotoxins, inflammatory and immune disorders and many others. The outcome is often acute kidney injury, which may progress to chronic kidney disease and fibrosis. Acute kidney injury involves not only direct injury to the distal tubular (DT) and proximal tubular (PT) epithelium during and immediately following the injurious event, but the closely-associated and sometimes dysfunctional renal vascular endothelium also plays an important part in modulating the tubular epithelial injury. In comparison with the PT, the DT epithelium is less sensitive to cell death, especially after ischemic injury. It is more prone to apoptosis than necrosis when it dies, and has key paracrine and autocrine functions in secreting an array of inflammatory, reparative, and survival cytokines that include chemotactic cytokines, polypeptide growth factors, and vasoactive peptides. In a neighborly way, the cytokines and growth factors secreted by the DT epithelium may then act positively on the ischemia-sensitive PT that has receptors to many of these proteins, but may not be able to synthesize them. A more complete understanding of these cellular events will allow protection against nephron destruction, regeneration leading to re-epithelialization of the injured tubules, or prevention of progression to chronic kidney disease. This review looks at these functions in the DT epithelial cells, specifically the cells in the medullary thick ascending limb of the loop of Henle, in contrast with those of the straight segment of the PT.
Publisher: BMJ
Date: 13-01-2021
DOI: 10.1136/BMJ.M4573
Abstract: To evaluate sodium-glucose cotransporter-2 (SGLT-2) inhibitors and glucagon-like peptide-1 (GLP-1) receptor agonists in patients with type 2 diabetes at varying cardiovascular and renal risk. Network meta-analysis. Medline, Embase, and Cochrane CENTRAL up to 11 August 2020. Randomised controlled trials comparing SGLT-2 inhibitors or GLP-1 receptor agonists with placebo, standard care, or other glucose lowering treatment in adults with type 2 diabetes with follow up of 24 weeks or longer. Two reviewers independently screened studies for eligibility, extracted data, and assessed risk of bias. Frequentist random effects network meta-analysis was carried out, and GRADE (grading of recommendations assessment, development, and evaluation) was used to assess evidence certainty. Results included estimated absolute effects of treatment per 1000 patients treated for five years for patients at very low risk (no cardiovascular risk factors), low risk (three or more cardiovascular risk factors), moderate risk (cardiovascular disease), high risk (chronic kidney disease), and very high risk (cardiovascular disease and kidney disease). A guideline panel provided oversight of the systematic review. 764 trials including 421 346 patients proved eligible. All results refer to the addition of SGLT-2 inhibitors and GLP-1 receptor agonists to existing diabetes treatment. Both classes of drugs lowered all cause mortality, cardiovascular mortality, non-fatal myocardial infarction, and kidney failure (high certainty evidence). Notable differences were found between the two agents: SGLT-2 inhibitors reduced admission to hospital for heart failure more than GLP-1 receptor agonists, and GLP-1 receptor agonists reduced non-fatal stroke more than SGLT-2 inhibitors (which appeared to have no effect). SGLT-2 inhibitors caused genital infection (high certainty), whereas GLP-1 receptor agonists might cause severe gastrointestinal events (low certainty). 
Low certainty evidence suggested that SGLT-2 inhibitors and GLP-1 receptor agonists might lower body weight. Little or no evidence was found for the effect of SGLT-2 inhibitors or GLP-1 receptor agonists on limb amputation, blindness, eye disease, neuropathic pain, or health related quality of life. The absolute benefits of these drugs vary substantially across patients from low to very high risk of cardiovascular and renal outcomes (eg, SGLT-2 inhibitors resulted in 3 to 40 fewer deaths in 1000 patients over five years); see the interactive decision support tool ( atch-it/200820dist/#!/ ) for all outcomes. In patients with type 2 diabetes, SGLT-2 inhibitors and GLP-1 receptor agonists reduced cardiovascular and renal outcomes, with some differences in benefits and harms. Absolute benefits are determined by individual risk profiles of patients, with clear implications for clinical practice, as reflected in the BMJ Rapid Recommendations directly informed by this systematic review. PROSPERO CRD42019153180.
Publisher: Elsevier BV
Date: 02-2014
DOI: 10.1016/J.BBRC.2014.01.047
Abstract: The use of recombinant human erythropoietin (rhEPO) to promote repair and minimize cardiac hypertrophy after myocardial infarction has had disappointing outcomes in clinical trials. We hypothesized that the beneficial non-hematopoietic effects of rhEPO against cardiac hypertrophy could be offset by the molecular changes initiated by rhEPO itself, leading to rhEPO resistance or maladaptive hypertrophy. This hypothesis was investigated using an isoproterenol-induced model of myocardial infarct and cardiac remodelling with emphasis on hypertrophy. In h9c2 cardiomyocytes, rhEPO decreased isoproterenol-induced hypertrophy, and the expression of the pro-fibrotic factors fibronectin, alpha smooth muscle actin and transforming growth factor beta-1 (TGF-β1). In contrast, by itself, rhEPO increased the expression of fibronectin and TGF-β1. Exogenous TGF-β1 induced a significant increase in hypertrophy, which was further potentiated by rhEPO. Exogenous fibronectin not only induced hypertrophy of cardiomyocytes, but also conferred resistance to rhEPO treatment. Based on these findings we propose that the outcome of rhEPO treatment for myocardial infarction is determined by the baseline concentrations of fibronectin and TGF-β1. If endogenous fibronectin or TGF-β levels are above a certain threshold, they could cause resistance to rhEPO therapy and enhancement of cardiac hypertrophy, respectively, leading to maladaptive hypertrophy.
Publisher: Elsevier BV
Date: 12-2007
DOI: 10.1053/J.AJKD.2007.08.015
Abstract: Peritonitis frequently complicates peritoneal dialysis. Appropriate treatment is essential to reduce adverse outcomes. Available trial evidence about peritoneal dialysis peritonitis treatment was evaluated. The Cochrane CENTRAL Registry (2005 issue), MEDLINE (1966 to February 2006), EMBASE (1985 to February 2006), and reference lists were searched to identify randomized trials of treatments for patients with peritoneal dialysis peritonitis. Trials of antibiotics (comparisons of routes, agents, and dosing regimens), fibrinolytic agents, peritoneal lavage, and intraperitoneal immunoglobulin. Treatment failure, relapse, catheter removal, microbiological eradication, hospitalization, all-cause mortality, and adverse reactions. 36 eligible trials were identified: 30 trials (1,800 patients) of antibiotics; 4 trials (229 patients) of urokinase; 1 trial (36 patients) of peritoneal lavage; and 1 trial (24 patients) of intraperitoneal immunoglobulin. No superior antimicrobial class was identified. In particular, glycopeptides and first-generation cephalosporins were equivalent (3 trials, 387 patients; relative risk [RR], 1.84; 95% confidence interval [CI], 0.95 to 3.58). Simultaneous catheter removal/replacement was superior to urokinase at decreasing treatment failures (1 trial, 37 patients; RR, 2.35; 95% CI, 1.13 to 4.91). Continuous and intermittent intraperitoneal antibiotic dosing were equivalent regarding treatment failure (4 trials, 338 patients; RR, 0.69; 95% CI, 0.37 to 1.30) and relapse (4 trials, 324 patients; RR, 0.93; 95% CI, 0.63 to 1.39). One trial showed superiority of intraperitoneal antibiotics over intravenous therapy. The methodological quality of trials generally was suboptimal, and outcome definitions were inconsistent. Small patient numbers led to inadequate power to show an effect. Interventions, such as optimal duration of antibiotic therapy, were not evaluated. Trials did not identify superior antibiotic regimens. 
Intermittent and continuous antibiotic dosing are equivalent treatment strategies.
Publisher: American Physiological Society
Date: 03-1999
DOI: 10.1152/AJPRENAL.1999.276.3.F467
Abstract: To investigate the possibility that 3-hydroxy-3-methylglutaryl CoA (HMGCoA) reductase inhibitors ameliorate renal disease via direct effects on the tubulointerstitium, primary cultures of human proximal tubule cells (PTC) and renal cortical fibroblasts (CF) were exposed for 24 h to simvastatin (0.1–10 μmol/l) under basal conditions and in the presence of 1,000 ng/ml of cyclosporin (CsA), which we have previously shown to promote in vitro interstitial matrix accumulation at least partially via activation of local cytokine networks. Simvastatin, in micromolar concentrations, engendered cholesterol-independent inhibition of CF and PTC thymidine incorporation and cholesterol-dependent suppression of PTC apical Na+/H+ exchange (NHE) (ethylisopropylamiloride-sensitive apical 22Na+ uptake). Similarly, CF secretion of insulin-like growth factor-I (IGF-I) and IGF binding protein-3 was depressed, whereas CF collagen synthesis ([3H]proline incorporation) and PTC secretion of the fibrogenic cytokines transforming growth factor-β1 and platelet-derived growth factor were unaffected. A lower concentration (0.1 μmol/l) of simvastatin did not affect any of the above parameters under basal conditions but completely prevented CsA-stimulated CF collagen synthesis (control, 6.6 ± 0.6; CsA, 8.3 ± 0.6; CsA+simvastatin, 6.2 ± 0.5%; P < 0.05) and IGF-I secretion (89.5 ± 16.6, 204.7 ± 57.0, and 94.6 ± 22.3 ng·mg protein−1·day−1, respectively; P < 0.05). The results suggest that simvastatin exerts direct cholesterol-dependent and -independent effects on the human kidney tubulointerstitium. HMGCoA reductase inhibitors may ameliorate interstitial fibrosis complicating CsA therapy via direct actions on human renal cortical fibroblasts.
Publisher: SAGE Publications
Date: 23-04-2020
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 15-12-2021
DOI: 10.2215/CJN.09070620
Abstract: People with kidney failure typically receive KRT in the form of dialysis or transplantation. However, studies have suggested that not all patients with kidney failure are best suited for KRT. Additionally, KRT is costly and not always accessible in resource-restricted settings. Conservative kidney management is an alternate kidney failure therapy that focuses on symptom management, psychologic health, spiritual care, and family and social support. Despite the importance of conservative kidney management in kidney failure care, several barriers exist that affect its uptake and quality. The Global Kidney Health Atlas is an ongoing initiative of the International Society of Nephrology that aims to monitor and evaluate the status of global kidney care worldwide. This study reports on findings from the 2018 Global Kidney Health Atlas survey, specifically addressing the availability, accessibility, and quality of conservative kidney management. Respondents from 160 countries completed the survey, and 154 answered questions pertaining to conservative kidney management. Of these, 124 (81%) stated that conservative kidney management was available. Accessibility was low worldwide, particularly in low-income countries. Fewer than half of countries utilized multidisciplinary teams (46%), utilized shared decision making (32%), or provided psychologic, cultural, or spiritual support (36%). One-quarter provided relevant health care providers with training on conservative kidney management delivery. Overall, conservative kidney management is available in most countries; however, it is not optimally accessible or of the highest quality.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 04-2002
DOI: 10.1097/00007890-200204150-00027
Abstract: Mycophenolate mofetil (MMF) is a potent immunosuppressive agent that has been shown to be superior to azathioprine in preventing early acute rejection in the general renal transplant population. However, it is uncertain whether these benefits also apply to older renal transplant recipients, who are known to be more susceptible to infectious complications and have considerably lower rates of rejection and immunological graft loss. A retrospective analysis was undertaken of all elderly (> or =55 years old) renal transplant recipients who underwent renal transplantation at the Princess Alexandra Hospital (1994-2000) and received either MMF (n=60) or azathioprine (n=55) in combination with prednisolone and cyclosporin. Data were analyzed on an intention-to-treat basis using a multivariate Cox proportional hazards model. The azathioprine- and MMF-treated groups were well matched at baseline with respect to demographic characteristics, end-stage renal failure causes and transplant characteristics. Compared with the MMF cohort, azathioprine-treated patients experienced a shorter time to first rejection [hazard ratio (HR) 4.47, 95% CI 1.53-13.1, P<0.01]. However, azathioprine-treated patients were also less likely to develop opportunistic infections (HR 0.11, 95% CI 0.03-0.41, P=0.001). No differences were observed between the two groups with respect to hospitalization rates, intensive care admissions, hematological complications, or posttransplant malignancies. Actuarial 2-year survival rates for the azathioprine- and MMF-treated patients were 100 and 87%, respectively (P<0.001). The principal cause of death in the MMF cohort was infection. Using a multivariate Cox regression analysis of patient survival, an adjusted hazard ratio of 0.01 (95% CI 0.001-0.08, P=0.001) was calculated in favor of azathioprine. 
Overall graft survival also tended to be better in patients receiving azathioprine (HR 0.27, 95% CI 0.06-1.33, P=0.11). In elderly renal transplant recipients, the combination of MMF, cyclosporin, and prednisolone appears to result in a worse outcome compared with the less potent combination of azathioprine, cyclosporin, and prednisolone. Future prospective studies need to specifically evaluate the risk/benefit ratios of newer, more potent immunosuppressive protocols, such as MMF-based regimens, in this important and sizeable patient subgroup.
Publisher: Elsevier BV
Date: 11-2005
DOI: 10.1111/J.1600-6143.2005.01073.X
Abstract: Obesity is associated with adverse cardiovascular (CV) parameters and may be involved in the pathogenesis of allograft dysfunction in renal transplant recipients (RTRs). We sought the spectrum of body mass index (BMI) and the relationships between BMI, CV parameters, and allograft function in prevalent RTRs. Data were collected at baseline and 2 years on 90 RTRs (mean age 51 years, 53% male, median transplant duration 7 years), categorized by BMI (normal, overweight, or obese [BMI > or = 30 kg/m2]). Proteinuria and glomerular filtration rate (eGFR(MDRD)) were determined. Nine percent of RTRs were obese pre-transplantation compared to 30% at baseline (p < 0.001) and follow-up (25 +/- 2 months). As BMI increased, the prevalence of metabolic syndrome and central obesity increased (12 vs 48 vs 85%, p < 0.001 and 3 vs 42 vs 96%, p < 0.001, respectively). Systolic blood pressure, fasting blood glucose, and lipid parameters changed significantly with BMI category and over time. Proteinuria progression occurred in 65% of obese RTRs (from 23 [13-59] to 59 [25-120] g/mol creatinine). BMI was independently associated with proteinuria progression (beta 0.01, p = 0.008) but not with changing eGFR(MDRD). In conclusion, obesity is common in RTRs and is associated with worsening CV parameters and proteinuria progression.
Publisher: Elsevier BV
Date: 09-2013
DOI: 10.1053/J.AJKD.2013.02.369
Abstract: Depression occurs relatively commonly in people with chronic kidney disease (CKD), but it is uncertain whether depression is a risk factor for premature death in this population. Interventions to reduce mortality in CKD consistently have been ineffective, and new strategies are needed. Systematic review and meta-analysis of cohort studies. Adults with CKD. Cohort studies identified in Ovid MEDLINE through week 3 of December 2012, without language restriction. Depression status as determined by physician diagnosis, clinical coding, or self-reported scales. All-cause and cardiovascular mortality. Outcomes were summarized as relative risks (RRs) with 95% CIs using random-effects meta-analysis. 22 studies (83,381 participants) comprising 12,063 cases of depression (mean prevalence, 27.4%; 95% CI, 20.0%-36.3%) with a follow-up of 3 months to 6.5 years were included. Methodological quality generally was good or fair. Depression consistently increased the risk of death from any cause (RR, 1.59; 95% CI, 1.35-1.87), but had less certain effects on cardiovascular mortality (RR, 1.88; 95% CI, 0.84-4.19). Associations for mortality were similar regardless of the diagnostic method used for depression, but were weaker in analyses controlled for preexisting cardiovascular disease (RR, 1.36; 95% CI, 1.23-1.50). Meta-analyses adjusting for antidepressant medication use were not possible, and data for kidney transplant recipients and individuals with earlier stages of CKD not treated with dialysis were limited. Depression is associated with a substantially increased risk of death in people with CKD. Effective treatment for depression in people with CKD may reduce mortality.
Publisher: S. Karger AG
Date: 2020
DOI: 10.1159/000507388
Abstract: Introduction: The significance of N-terminal pro-B type natriuretic peptide (NT-proBNP) to detect heart failure in patients with end-stage kidney disease on dialysis is controversial. Objective: To assess whether serial measurements of NT-proBNP can predict worsening cardiac function in dialysis patients. Methods: In this prospective, longitudinal, observational cohort study, the relationships between changes in monthly plasma NT-proBNP concentrations and changes in echocardiographic indices (left ventricular global longitudinal strain [GLS] and ejection fraction [LVEF]) were analyzed in dialysis patients without symptoms of heart failure over 24 months using multilevel mixed effects models. Results: The study included 40 dialysis patients who were followed for a median period of 24 months. Logarithmically transformed baseline plasma NT-proBNP levels correlated positively with GLS (r = 0.48, p = 0.002) and negatively with LVEF (r = –0.44, p = 0.005). Time-averaged and maximum NT-proBNP values during the echocardiogram intervals were significantly correlated with GLS and LVEF over time. Every 1-unit increase in average NT-proBNP level during the echocardiogram interval was associated with a 0.99 (95% confidence interval, 0.41–1.56) higher GLS (%) and a 2.90 (1.22–4.57) lower LVEF (%). Every 1-unit increase in maximum NT-proBNP level was associated with a 0.90 (0.35–1.45) higher GLS (%) and a 2.67 (1.03–4.30) lower LVEF (%). This increase in GLS indicates a reduction in systolic performance. Conclusions: Our cohort study demonstrated that serial plasma NT-proBNP concentrations may be useful for early identification of individuals with worsening cardiac function over time.
Publisher: Wiley
Date: 22-06-2016
DOI: 10.1111/NEP.12731
Abstract: This paper updates a previous 'Call to Action' paper (Nephrology 2011 16: 19-29) that reviewed key outcome data for Australian and New Zealand peritoneal dialysis patients and made recommendations to improve care. Since its publication, peritonitis rates have improved significantly, although they have plateaued more recently. Peritoneal dialysis patient and technique survival in Australian and New Zealand have also improved, with a reduction in the proportion of technique failures attributed to 'social reasons'. Despite these improvements, technique survival rates overall remain lower than in many other parts of the world. This update includes additional practical recommendations based on published evidence and emerging initiatives to further improve outcomes.
Publisher: Hindawi Limited
Date: 2012
DOI: 10.1155/2012/812609
Abstract: Peritoneal dialysis (PD) is a preferred home dialysis modality and has a number of added advantages including improved initial patient survival and cost effectiveness over haemodialysis. Despite these benefits, uptake of PD remains relatively low, especially in developed countries. Wider implementation of PD is compromised by higher technique failure from infections (e.g., PD peritonitis) and ultrafiltration failure. These are inevitable consequences of peritoneal injury, which is thought to result primarily from continuous exposure to PD fluids that are characterised by their “unphysiologic” composition. In order to overcome these barriers, a number of more biocompatible PD fluids, with neutral pH, low glucose degradation product content, and bicarbonate buffer have been manufactured over the past two decades. Several preclinical studies have demonstrated their benefit in terms of improvement in host cell defence, peritoneal membrane integrity, and cytokine profile. This paper aims to review randomised controlled trials assessing the use of biocompatible PD fluids and their effect on clinical outcomes.
Publisher: Springer Science and Business Media LLC
Date: 19-04-2013
Publisher: Elsevier BV
Date: 08-2010
DOI: 10.1038/KI.2010.149
Abstract: Non-Pseudomonas Gram-negative (NPGN) peritonitis is a frequent, serious complication of peritoneal dialysis; however, previous reports have been limited to small, single-center studies. To gain insight on the frequency, predictors, treatment, and outcomes of NPGN peritonitis, we analyzed data in the ANZDATA registry of all adult Australian peritoneal dialysis patients over a 39-month period using multivariate logistic and multilevel Poisson regressions. There were 837 episodes of NPGN peritonitis (23.3% of all peritonitis) that occurred in 256 patients. The most common organism isolated was Escherichia coli, but included Klebsiella, Enterobacter, Serratia, Acinetobacter, Proteus, and Citrobacter, with multiple organisms identified in a quarter of the patients. The principal risk factor was older age, with poorer clinical outcome predicted by older age and polymicrobial peritonitis. The overall antibiotic cure rate was 59%. NPGN peritonitis was associated with significantly higher risks of hospitalization, catheter removal, permanent transfer to hemodialysis, and death compared to other organisms contributing to peritonitis. Underlying bowel perforation requiring surgery was uncommon. Hence, we show that NPGN peritonitis is a frequent, serious complication of peritoneal dialysis, associated with significant risks, including death. Its cure with antibiotics alone is less likely when multiple organisms are involved.
Publisher: Elsevier BV
Date: 05-2011
DOI: 10.1053/J.AJKD.2010.12.018
Abstract: Planned early initiation of dialysis therapy based on estimated kidney function does not influence mortality and major comorbid conditions, but amelioration of symptoms may improve quality of life and decrease costs. Patients with progressive chronic kidney disease and a Cockcroft-Gault estimated glomerular filtration rate of 10-15 mL/min/1.73 m(2) were randomly assigned to start dialysis therapy at a glomerular filtration rate of either 10-14 (early start) or 5-7 mL/min/1.73 m(2) (late start). Of the original 828 patients in the IDEAL (Initiation of Dialysis Early or Late) Trial in renal units in Australia and New Zealand, 642 agreed to participate in this cost-effectiveness study. A societal perspective was taken for costs. Patients were enrolled between July 1, 2000, and November 14, 2008, and followed up until November 14, 2009. The intervention was planned earlier start of maintenance dialysis therapy; the outcomes were differences in quality of life and costs. Median follow-up of patients (307 early start, 335 late start) was 4.15 years, with a 6-month difference in median duration of dialysis therapy. Mean direct dialysis costs were significantly higher in the early-start group ($10,777; 95% CI, $313 to $22,801). Total costs, including costs for resources used to manage adverse events, were higher in the early-start group ($18,715; 95% CI, -$3,162 to $43,021), although not statistically different. Adjusted for differences in baseline quality of life, the difference in quality-adjusted survival between groups over the time horizon of the trial was not statistically different (0.02 full-health-equivalent years; 95% CI, -0.09 to 0.14). Missing quality-of-life questionnaires and skewed cost data, although similar in each group, decrease the precision of results. Planned early initiation of dialysis therapy in patients with progressive chronic kidney disease has higher dialysis costs and is not associated with improved quality of life.
Publisher: Elsevier BV
Date: 10-2000
Publisher: Public Library of Science (PLoS)
Date: 12-05-2011
Publisher: Wiley
Date: 11-2006
Abstract: The aim of this study was to evaluate dosing schedules of gentamicin in patients with end-stage renal disease and receiving hemodialysis. Forty-six patients were recruited who received gentamicin while on hemodialysis. Each patient provided approximately 4 blood samples at various times before and after dialysis for analysis of plasma gentamicin concentrations. A population pharmacokinetic model was constructed using NONMEM (version 5). The clearance of gentamicin during dialysis was 4.69 L/h and between dialysis was 0.453 L/h. The clearance between dialysis was best described by residual creatinine clearance (as calculated using the Cockcroft and Gault equation), which probably reflects both lean mass and residual clearance mechanisms. Simulation from the final population model showed that predialysis dosing has a higher probability of achieving target maximum concentrations (Cmax > 8 mg/L) within acceptable exposure limits (area under the concentration-time curve [AUC] values > 70 and < 120 mg x h/L per 24 hours) than postdialysis dosing.
Publisher: Oxford University Press (OUP)
Date: 22-01-2010
DOI: 10.1093/NDT/GFP780
Abstract: Automated peritoneal dialysis (APD) is widely recommended for the management of high transporters by the International Society of Peritoneal Dialysis (ISPD), although there have been no adequate studies to date comparing the outcomes of APD and continuous ambulatory peritoneal dialysis (CAPD) in this high-risk group. The relative impact of APD versus CAPD on patient and technique survival rates was examined by both intention-to-treat (PD modality at Day 90) and 'as-treated' time-varying Cox proportional hazards model analyses in all patients who started PD in Australia or New Zealand between 1 April 1999 and 31 March 2004 and who had baseline peritoneal equilibration tests confirming the presence of high peritoneal transport status. During the study period, 4128 patients commenced PD. Of these, 628 patients were high transporters on PD at Day 90 (486 on APD and 142 on CAPD). Compared to high transporters treated with CAPD, APD-treated high transporters were more likely to be younger and Caucasian, and less likely to be diabetic. On multivariate intention-to-treat analysis, APD treatment was associated with superior survival [adjusted hazard ratio (HR) 0.56, 95% confidence interval (CI) 0.35-0.87] and comparable death-censored technique survival (HR 0.88, 95% CI 0.64-1.21). Superior survival of high transporters treated with APD versus CAPD was also confirmed in supplemental as-treated analysis (HR 0.72, 95% CI 0.54-0.96), matched case-control analysis (HR 0.60, 95% CI 0.36-0.96) and subgroup analysis of high transporters treated entirely with APD versus those treated entirely with CAPD (HR 0.29, 95% CI 0.14-0.60). There were no statistically significant differences in patient survival or death-censored technique survival between APD and CAPD for any other transport group, except for low transporters, who experienced a higher mortality rate on APD compared with CAPD (HR 2.19, 95% CI 1.02-4.70).
APD treatment is associated with a significant survival advantage in high transporters compared with CAPD. However, APD treatment is associated with inferior survival in low transporters.
Publisher: Springer Science and Business Media LLC
Date: 08-07-2016
Publisher: Wiley
Date: 16-07-2019
Abstract: ED access block is an ongoing significant problem and has been associated with excess mortality. Multiple models of care have been studied in an effort to improve access block and other key performance indicators (KPIs) of ED. This study describes the impact of a new model of care using an ED led, consultant run clinical decision unit (CDU) on performance, using a retrospective analysis of data for 9-month periods before and after the introduction of the CDU model of care. Primary outcomes were access block (percentage of patients admitted >8 h), discharge National Emergency Access Target (NEAT) adherence and Queensland Ambulance Service level three escalations. After the implementation of the CDU, access block significantly improved. There was a significant improvement in NEAT adherence. Total ambulance ramping time fell by 58% and ambulance service level three escalations fell from 21 to 5 post-CDU implementation. Overall there was no change to hospital mortality numbers. The percentage of patients that did not wait and 30 day representations showed a small but statistically significant decrease. In summary, this ED led, consultant run CDU model of care resulted in significantly improved performance on a range of KPIs, including improvement in access block and NEAT figures. The substantial improvements in ambulance ramping and escalations also indicated that the department was able to cope better with periods of high activity.
Publisher: Elsevier BV
Date: 06-2013
DOI: 10.1053/J.AJKD.2012.08.045
Abstract: Most patients with end-stage renal disease require dialysis to survive because they are unable to access kidney transplantation. Peritoneal dialysis (PD) is recommended by some clinical practice guidelines as the dialysis treatment of choice for adults without significant comorbid conditions or those with residual kidney function. This study aims to synthesize published qualitative studies of patients' experiences, beliefs, and attitudes about PD. We conducted a systematic review and thematic synthesis of qualitative studies of adult perspectives of living with PD. Databases (MEDLINE, Embase, PsycINFO, and CINAHL), theses, and reference lists were searched to November 2011. 39 studies involving 387 participants were included. We identified 7 themes: resilience and confidence (determination and overcoming vicissitudes), support structures (strong family relationship, peer support, professional dedication, social abandonment, and desire for holistic care), overwhelming responsibility (disruptive intrusion, family burden, and onerous treatment regimen), control (gaining bodily awareness, achieving independence and self-efficacy, and information seeking), freedom (flexibility and autonomy, retaining social functioning, and ability to travel), sick identity (damage to self-esteem and invisible suffering), and disablement (physical incapacitation and social loss and devaluation). PD can offer patients a sense of control, independence, self-efficacy, and freedom. However, holistic and multidisciplinary care is needed to mitigate the risks of impaired self-esteem, physical incapacitation, reduced social functioning, and poor sense of self-worth. Strategies that aim to strengthen social support and promote resilience and confidence in patients are integral to achieving positive adjustment, improved psychosocial outcomes, and treatment satisfaction.
Publisher: Elsevier BV
Date: 10-1995
DOI: 10.1016/0272-6386(95)90605-3
Abstract: We describe the rapid and dramatic improvement in gastrointestinal function that occurred after successful renal transplantation in a woman with severe sclerosing peritonitis secondary to continuous ambulatory peritoneal dialysis (CAPD). We postulate that the anti-inflammatory effect of the immunosuppressive agents was the most important factor leading to the patient's recovery.
Publisher: Elsevier BV
Date: 09-2017
DOI: 10.1053/J.AJKD.2016.12.008
Abstract: Intravenous (IV) cyclophosphamide has been first-line treatment for inducing disease remission in lupus nephritis. The comparative efficacy and toxicity of newer agents such as mycophenolate mofetil (MMF) and calcineurin inhibitors are uncertain. Network meta-analysis. Patients with proliferative lupus nephritis. Randomized trials of immunosuppression to induce or maintain disease remission. IV cyclophosphamide, oral cyclophosphamide, MMF, calcineurin inhibitor, plasma exchange, rituximab, or azathioprine, alone or in combination. Complete remission, end-stage kidney disease, all-cause mortality, doubling of serum creatinine level, relapse, and adverse events. 53 studies involving 4,222 participants were eligible. Induction and maintenance treatments were administered for 12 (IQR, 6-84) and 25 (IQR, 12-48) months, respectively. There was no evidence of different effects between therapies on all-cause mortality, doubling of serum creatinine level, or end-stage kidney disease. Compared to IV cyclophosphamide, the most effective treatments to induce remission, based on moderate- to high-quality evidence, were combined MMF and calcineurin inhibitor therapy, calcineurin inhibitors, and MMF (ORs were 2.69 [95% CI, 1.74-4.16], 1.86 [95% CI, 1.05-3.30], and 1.54 [95% CI, 1.04-2.30], respectively). MMF was significantly less likely than IV cyclophosphamide to cause alopecia (OR, 0.21; 95% CI, 0.12-0.36), and MMF combined with calcineurin inhibitor therapy was less likely to cause ovarian failure (OR, 0.25; 95% CI, 0.07-0.93). Regimens generally had similar odds of major infection. MMF was the most effective strategy to maintain remission. Outcome definitions not standardized, short duration of follow-up, and possible confounding by previous or subsequent therapy. Evidence for induction therapy for lupus nephritis is inconclusive based on treatment effects on all-cause mortality, doubling of serum creatinine level, and end-stage kidney disease.
MMF, calcineurin inhibitors, or their combination were most effective for inducing remission compared to IV cyclophosphamide, while conferring similar or lower treatment toxicity. MMF was the most effective maintenance therapy.
Publisher: Springer Science and Business Media LLC
Date: 12-2012
Abstract: The aim of this study was to investigate the characteristics and outcomes of patients receiving renal replacement therapy for end-stage kidney disease (ESKD) secondary to haemolytic uraemic syndrome (HUS). The study included all patients with ESKD who commenced renal replacement therapy in Australia and New Zealand between 15/5/1963 and 31/12/2010, using data from the ANZDATA Registry. HUS ESKD patients were compared with matched controls with an alternative primary renal disease using propensity scores based on age, gender and treatment era. Of the 58422 patients included in the study, 241 (0.4%) had ESKD secondary to HUS. HUS ESKD was independently associated with younger age, female gender and European race. Compared with matched controls, HUS ESKD was not associated with mortality on renal replacement therapy (adjusted hazard ratio [HR] 1.14, 95% CI 0.87-1.50, p = 0.34) or dialysis (HR 1.34, 95% CI 0.93-1.93, p = 0.12), but did independently predict recovery of renal function (HR 54.01, 95% CI 1.45-11.1, p = 0.008). 130 (54%) HUS patients received 166 renal allografts. Overall renal allograft survival rates were significantly lower for patients with HUS ESKD at 1 year (73% vs 91%), 5 years (62% vs 85%) and 10 years (49% vs 73%). HUS ESKD was an independent predictor of renal allograft failure (HR 2.59, 95% CI 1.70-3.95, p < 0.001). Sixteen (12%) HUS patients experienced failure of 22 renal allografts due to recurrent HUS. HUS ESKD was not independently associated with the risk of death following renal transplantation (HR 0.92, 95% CI 0.35-2.44, p = 0.87). HUS is an uncommon cause of ESKD, which is associated with comparable patient survival on dialysis, an increased probability of renal function recovery, comparable patient survival post-renal transplant and a heightened risk of renal transplant graft failure compared with matched ESKD controls.
Publisher: Elsevier BV
Date: 2022
DOI: 10.1016/J.IJANTIMICAG.2022.106691
Abstract: There is uncertainty about whether piperacillin/tazobactam (PT) increases the risk of AKI in patients without concomitant use of vancomycin. We compared risk of hospital-acquired acute kidney injury (HA-AKI) among adults treated with PT or anti-pseudomonal β-lactams (meropenem, ceftazidime), without concomitant use of vancomycin. This real-world study analyzed data from the China Renal Data System (CRDS) and assessed the HA-AKI risk in adults hospitalized with infection after exposure to PT, meropenem or ceftazidime, in the absence of concomitant vancomycin. The primary outcome was any stage of HA-AKI according to the Kidney Disease Improving Global Outcomes guidelines. A multivariable Cox regression model and different propensity score matching models were used. Among the 29,441 adults (mean [SD] age, 62.44 [16.84] years; 17,980 female [61.1%]) included in this study, 14,721 (50%) used PT, 9,081 (31%) used meropenem and 5,639 (19%) used ceftazidime. During a median follow-up period of 8 days, 3,476 (6.9%) developed HA-AKI. Use of PT was not associated with statistically increased risk of HA-AKI compared with meropenem (adjusted hazard ratio [aHR]: 1.07; 95% CI, 0.97-1.19), ceftazidime (aHR: 1.09; 95% CI, 0.92-1.3) or both agents (aHR: 1.07; 95% CI, 0.97-1.17) after adjusting for confounders. Results were consistent in stratified analyses, in propensity score (PS) matching using logistic regression or random forest methods to generate a PS, and in an analysis restricting outcomes to AKI stage 2-3. Without concomitant vancomycin use, the risk of AKI following PT therapy is comparable with that of meropenem or ceftazidime among adults hospitalized with infection.
Publisher: Massachusetts Medical Society
Date: 10-03-2011
DOI: 10.1056/NEJMC1100105
Publisher: Wiley
Date: 27-01-2021
Publisher: Springer Science and Business Media LLC
Date: 19-08-2015
Publisher: Elsevier BV
Date: 03-2007
Publisher: American Medical Association (AMA)
Date: 19-07-2016
Abstract: Numerous glucose-lowering drugs are used to treat type 2 diabetes. To estimate the relative efficacy and safety associated with glucose-lowering drugs including insulin. Cochrane Library Central Register of Controlled Trials, MEDLINE, and EMBASE databases through March 21, 2016. Randomized clinical trials of 24 weeks' or longer duration. Random-effects network meta-analysis. The primary outcome was cardiovascular mortality. Secondary outcomes included all-cause mortality, serious adverse events, myocardial infarction, stroke, hemoglobin A1c (HbA1C) level, treatment failure (rescue treatment or lack of efficacy), hypoglycemia, and body weight. A total of 301 clinical trials (1,417,367 patient-months) were included: 177 trials (56,598 patients) of drugs given as monotherapy; 109 trials (53,030 patients) of drugs added to metformin (dual therapy); and 29 trials (10,598 patients) of drugs added to metformin and sulfonylurea (triple therapy). There were no significant differences in associations between any drug class as monotherapy, dual therapy, or triple therapy with odds of cardiovascular or all-cause mortality. Compared with metformin, sulfonylurea (standardized mean difference [SMD], 0.18 [95% CI, 0.01 to 0.34]), thiazolidinedione (SMD, 0.16 [95% CI, 0.00 to 0.31]), DPP-4 inhibitor (SMD, 0.33 [95% CI, 0.13 to 0.52]), and α-glucosidase inhibitor (SMD, 0.35 [95% CI, 0.12 to 0.58]) monotherapy were associated with higher HbA1C levels. Sulfonylurea (odds ratio [OR], 3.13 [95% CI, 2.39 to 4.12]; risk difference [RD], 10% [95% CI, 7% to 13%]) and basal insulin (OR, 17.9 [95% CI, 1.97 to 162]; RD, 10% [95% CI, 0.08% to 20%]) were associated with greatest odds of hypoglycemia. When added to metformin, drugs were associated with similar HbA1C levels, while SGLT-2 inhibitors offered the lowest odds of hypoglycemia (OR, 0.12 [95% CI, 0.08 to 0.18]; RD, -22% [95% CI, -27% to -18%]).
When added to metformin and sulfonylurea, GLP-1 receptor agonists were associated with the lowest odds of hypoglycemia (OR, 0.60 [95% CI, 0.39 to 0.94]; RD, -10% [95% CI, -18% to -2%]). Among adults with type 2 diabetes, there were no significant differences in the associations between any of 9 available classes of glucose-lowering drugs (alone or in combination) and the risk of cardiovascular or all-cause mortality. Metformin was associated with lower or no significant difference in HbA1C levels compared with any other drug classes. All drugs were estimated to be effective when added to metformin. These findings are consistent with American Diabetes Association recommendations for using metformin monotherapy as initial treatment for patients with type 2 diabetes and selection of additional therapies based on patient-specific considerations.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 27-08-2019
DOI: 10.2215/CJN.03200319
Abstract: The burden of infectious disease is high among kidney transplant recipients because of concomitant immunosuppression. In this study, the incidence of infectious-related mortality and associated factors were evaluated. In this registry-based retrospective, longitudinal cohort study, recipients of a first kidney transplant in Australia and New Zealand between 1997 and 2015 were included. Cumulative incidence of infectious-related mortality was estimated using competing risk regression (using noninfectious mortality as a competing risk event), and compared with age-matched, population-based data using standardized incidence ratios. Among 12,519 patients (median age 46 years; 63% men; 15% diabetic; 6% Indigenous ethnicity), 2197 (18%) died, of whom 416 (19%) died from infection. The incidence of infection-related mortality during the study period (1997–2015) was 45.8 (95% confidence interval [95% CI], 41.6 to 50.4) per 10,000 patient-years. The incidence of infection-related mortality reduced from 53.1 (95% CI, 45.0 to 62.5) per 10,000 person-years in 1997–2000 to 43.9 (95% CI, 32.5 to 59.1) per 10,000 person-years in 2011–2015 (P < 0.001). Compared with the age-matched general population, kidney transplant recipients had a markedly higher risk of infectious-related death (standardized incidence ratio, 7.8; 95% CI, 7.1 to 8.6). Infectious mortality was associated with older age (≥60 years: adjusted subdistribution hazard ratio [SHR], 4.16; 95% CI, 2.15 to 8.05; reference, 20–30 years), female sex (SHR, 1.62; 95% CI, 1.19 to 2.29), Indigenous ethnicity (SHR, 2.87; 95% CI, 1.84 to 4.46; reference, white), earlier transplant era (2011–2015: SHR, 0.39; 95% CI, 0.20 to 0.76; reference, 1997–2000), and use of T cell–depleting therapy (SHR, 2.43; 95% CI, 1.36 to 4.33). Live donor transplantation was associated with lower risk of infection-related mortality (SHR, 0.53; 95% CI, 0.37 to 0.76).
Infection-related mortality in kidney transplant recipients is significantly higher than the general population, but has reduced over time. Risk factors include older age, female sex, Indigenous ethnicity, T cell–depleting therapy, and deceased donor transplantation.
Publisher: Bentham Science Publishers Ltd.
Date: 02-2009
DOI: 10.2174/138920009787522205
Abstract: Mycophenolate mofetil (MMF) is the preferred antimetabolite in solid organ transplantation. It is a prodrug that undergoes pre-systemic metabolism to mycophenolic acid (MPA), the active drug moiety. MMF is typically administered as a fixed dose without routine monitoring of MPA concentrations. However, a role for therapeutic drug monitoring (TDM) of MPA has been suggested based on the drug's narrow therapeutic window and considerable between-subject variability. Dose-normalized MPA area under the concentration-time curve (AUC) has been observed to vary ≥10-fold. Some of this variability may be accounted for by patient variability in renal and liver function, serum albumin and haemoglobin levels, body mass, concomitant medication exposure and genetic polymorphisms in enzymes responsible for drug metabolism and transport, but much is unexplained. Widespread adoption of MPA TDM has been limited by the impracticality of full 0 to 12 hour AUC measurement (AUC(0-12)), poor correlation between pre-dose MPA concentration and AUC(0-12), ongoing questions regarding the utility of free versus total MPA measurements and lack of evidence correlating MPA exposure with clinical outcomes. Two recent randomized studies evaluating the role of MPA TDM in renal transplant recipients have reported conflicting results. Promising areas of ongoing study include use of Bayesian forecasting to predict MPA dosage and measurement of inosine monophosphate dehydrogenase activity. This review provides an overview of the pharmacokinetics of MMF in solid organ transplantation, and discusses the benefits and limitations of MPA monitoring. Areas that require additional research are identified.
Publisher: Oxford University Press (OUP)
Date: 28-02-2011
DOI: 10.1093/NDT/GFQ861
Abstract: Scleroderma is an uncommon cause of end-stage kidney disease (ESKD) which carries significant morbidity and mortality risks. The aim of this study was to determine the prevalence, treatment and outcomes of scleroderma patients with ESKD. A study was conducted of all ESKD patients enrolled in the ANZDATA registry, who commenced dialysis between 15 May 1963 and 31 December 2005, and remained on dialysis for at least 90 days. Of the 40 238 patients who commenced dialysis during the study period, 127 (0.3%) patients had ESKD secondary to scleroderma. Scleroderma ESKD patients were more likely than other ESKD patients to be female (72% versus 43%, P < 0.001), Caucasian (98% versus 79%, P < 0.001) and of lower BMI (22.7 ± 4.7 versus 26.0 ± 5.9, P < 0.001) with a higher prevalence of chronic lung disease (36 versus 14%, P < 0.001) and lower prevalence of diabetes mellitus (10% versus 32%, P < 0.001) and coronary artery disease (23% versus 35%, P = 0.01). Median survival was significantly shorter in scleroderma ESKD (2.43 years, 95% confidence interval (CI) 1.75-3.11 years) than other ESKD (6.02 years, 95% CI 5.89-6.14 years, log-rank score 55.7, P < 0.001). Renal recovery was more likely in scleroderma patients (10% versus 1%, P < 0.001) with a shorter time to recovery. Scleroderma was found to be an independent predictor for mortality (HR 2.47, 95% CI 1.99-3.05) and renal recovery (HR 11.1, 95% CI 6.37-19.4). Five year deceased donor and live donor renal allograft survival rates of recipients with scleroderma were 53 and 100%, respectively. Scleroderma is an uncommon cause of ESKD, which is associated with increased risks of both spontaneous renal recovery and mortality.
Publisher: Wiley
Date: 31-05-2019
Publisher: Springer Science and Business Media LLC
Date: 02-11-2012
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 15-06-2003
Publisher: Elsevier BV
Date: 10-2017
Publisher: Wiley
Date: 18-01-2004
Publisher: Elsevier BV
Date: 09-2018
Publisher: Elsevier BV
Date: 05-2001
Publisher: Elsevier BV
Date: 11-2002
Publisher: Wiley
Date: 04-1997
Publisher: Elsevier BV
Date: 07-2020
DOI: 10.1053/J.AJKD.2019.09.016
Abstract: Peritoneal dialysis (PD)-related peritonitis carries high morbidity for PD patients. Understanding the characteristics and risk factors for peritonitis can guide regional development of prevention strategies. We describe peritonitis rates and the associations of selected facility practices with peritonitis risk among countries participating in the Peritoneal Dialysis Outcomes and Practice Patterns Study (PDOPPS). Observational prospective cohort study. 7,051 adult PD patients in 209 facilities across 7 countries (Australia, New Zealand, Canada, Japan, Thailand, United Kingdom, United States). Facility characteristics (census count, facility age, nurse to patient ratio) and selected facility practices (use of automated PD, use of icodextrin or biocompatible PD solutions, antibiotic prophylaxis strategies, duration of PD training). Peritonitis rate (by country, overall and variation across facilities), microbiology patterns. Poisson rate estimation, proportional rate models adjusted for selected patient case-mix variables. 2,272 peritonitis episodes were identified in 7,051 patients (crude rate, 0.28 episodes per patient-year). Facility peritonitis rates were variable within each country and exceeded 0.50 episodes per patient-year in 10% of facilities. Overall peritonitis rates, in episodes per patient-year, were 0.40 (95% CI, 0.36-0.46) in Thailand, 0.38 (95% CI, 0.32-0.46) in the United Kingdom, 0.35 (95% CI, 0.30-0.40) in Australia/New Zealand, 0.29 (95% CI, 0.26-0.32) in Canada, 0.27 (95% CI, 0.25-0.30) in Japan, and 0.26 (95% CI, 0.24-0.27) in the United States. The microbiology of peritonitis was similar across countries, except in Thailand, where Gram-negative infections and culture-negative peritonitis were more common. Facility size was positively associated with risk for peritonitis in Japan (rate ratio [RR] per 10 patients, 1.07; 95% CI, 1.04-1.09).
Lower peritonitis risk was observed in facilities that had higher automated PD use (RR per 10 percentage points greater, 0.95; 95% CI, 0.91-1.00), facilities that used antibiotics at catheter insertion (RR, 0.83; 95% CI, 0.69-0.99), and facilities with PD training duration of 6 or more (vs <6) days (RR, 0.81; 95% CI, 0.68-0.96). Lower peritonitis risk was seen in facilities that used topical exit-site mupirocin or aminoglycoside ointment, but this association did not achieve conventional levels of statistical significance (RR, 0.79; 95% CI, 0.62-1.01). Sampling variation, selection bias (rate estimates), and residual confounding (associations). Important international differences exist in the risk for peritonitis that may result from varied and potentially modifiable treatment practices. These findings may inform future guidelines in potentially setting lower maximally acceptable peritonitis rates.
Publisher: Elsevier BV
Date: 03-2020
DOI: 10.1053/J.AJKD.2019.09.017
Abstract: Outcomes reported in randomized controlled trials in peritoneal dialysis (PD) are diverse, are measured inconsistently, and may not be important to patients, families, and clinicians. The Standardized Outcomes in Nephrology-Peritoneal Dialysis (SONG-PD) initiative aims to establish a core outcome set for trials in PD based on the shared priorities of all stakeholders. We convened an international SONG-PD stakeholder consensus workshop in May 2018 in Vancouver, Canada. Nineteen patients/caregivers and 51 health professionals attended. Participants discussed core outcome domains and implementation in trials in PD. Four themes relating to the formation of core outcome domains were identified: life participation as a main goal of PD, impact of fatigue, empowerment for preparation and planning, and separation of contributing factors from core factors. Considerations for implementation were identified: standardizing patient-reported outcomes, requiring a validated and feasible measure, simplicity of binary outcomes, responsiveness to interventions, and using positive terminology. All stakeholders supported inclusion of PD-related infection, cardiovascular disease, mortality, technique survival, and life participation as the core outcome domains for PD.
Publisher: No publisher found
Date: 2011
Publisher: Springer Science and Business Media LLC
Date: 13-07-2017
DOI: 10.1007/S00467-017-3728-Y
Abstract: Advances in the care of children mean that adolescents with chronic kidney disease (CKD) are surviving to adulthood and requiring transition to adult care. The transition phase is well recognised to be associated with considerable excess morbidity and graft loss, but these outcomes may be avoidable through a structured transition programme. This review will discuss: (1) the challenges encountered by patients with CKD, caregivers and clinicians during transition; (2) predictors and outcomes of transition; (3) current guidelines on transition from paediatric to adult renal services; and (4) interventions and research directions that may help to improve the care and outcomes for young people with CKD in transition. Despite the substantial health gains required for this disadvantaged population, there is to date only limited evidence on the effects of current transition programmes.
Publisher: SAGE Publications
Date: 11-2009
Publisher: Oxford University Press (OUP)
Date: 14-12-2019
DOI: 10.1093/NDT/GFY340
Abstract: A number of peritoneal dialysis (PD) systems are available but there have been few studies comparing them. The aim of this study was to examine technique failure and patient survival between different PD company systems. The study included all patients who commenced PD between 1995 and 2014 in Australia and New Zealand. Groups were compared according to the initial PD company system that they received. The primary outcome was a composite of PD technique failure and death. A total of 16 575 patients commenced PD using systems manufactured by Baxter [n = 13 438 (81%)], Fresenius Medical Care [n = 2848 (17%)] or Gambro [n = 289 (2%)]. Of these, 11 870 (72%) developed technique failure, including 5421 (33%) who died. The median time to technique failure or death for all patients was 625 [interquartile range (IQR) 318-1114] days: 629.5 (IQR 321-1121) days with Baxter, 620.5 (IQR 311-1069) days with Fresenius Medical Care and 538 (IQR 272-1001) days with Gambro systems (P = 0.023). There was a statistically significant increase in technique failure or mortality rates in patients on Gambro {adjusted incidence rate ratio [IRR] 1.46 [95% confidence interval (CI) 1.33-1.62]} and Fresenius [adjusted IRR 1.10 (95% CI 1.01-1.19)] systems compared with Baxter systems. No difference in patient survival was observed between the three PD systems. PD systems manufactured by different companies may be associated with important differences in PD technique survival. This needs to be confirmed with adequately powered, prospective randomized controlled clinical trials.
Publisher: Wiley
Date: 15-02-2006
Publisher: Springer Science and Business Media LLC
Date: 04-04-2014
Publisher: Elsevier BV
Date: 09-2020
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 12-2005
DOI: 10.1097/01.TP.0000183895.88572.13
Abstract: Uric acid (UA) may play a pathogenetic role in hypertension and kidney disease. We explored the prevalence of hyperuricemia and the relationship of UA to graft function and hypertension in prevalent renal transplant recipients (RTR). Baseline and follow-up data were collected on 90 RTR (mean age 51 yrs, 53% male, median transplant duration 7 years). Graft function was estimated using MDRD Study Equation 7. At baseline, 70% RTR had hyperuricemia (UA >7.0 mg/dl (0.42 mmol/L) in men and >6.0 mg/dl (0.36 mmol/L) in women) compared to 80% after 2.2 years (P=0.06). UA was not associated with blood pressure (BP) level but was higher in RTR with a history of hypertension compared to those without (8.6+/-1.8 vs. 7.3+/-2.2 mg/dl, [0.51+/-0.11 vs. 0.43+/-0.13 mmol/L], P=0.003) and in RTR on > or =3 antihypertensive medications compared to those taking less (9.1+/-1.6 vs. 7.6+/-1.8 mg/dL, [0.54+/-0.1 vs. 0.45+/-0.11 mmol/L], P<0.001). A history of hypertension was independently predictive of UA (beta 0.06, [95% CI 0.02 to 0.10], P=0.007) in addition to sex, cyclosporine dose, prednisolone dose, estimated glomerular filtration rate (eGFRMDRD) and beta-blocker therapy. UA was independently predictive of follow-up eGFRMDRD (beta -22.2 [95% CI -41.2 to -3.2], P=0.02) but did not predict change in eGFRMDRD over time. UA was independently associated with requirement for antihypertensive therapy (beta 0.34, [95% CI 1.05 to 1.90], P=0.02). Hyperuricemia is common in RTR and is associated with need for antihypertensive therapy and level of graft function.
Publisher: Wiley
Date: 02-2017
DOI: 10.1111/NEP.12933
Abstract: Disorders in the regulation of the alternate complement pathway often result in complement-mediated damage to the microvascular endothelium and can be associated with both glomerulonephritis and atypical haemolytic uraemic syndrome. Inherited defects in complement regulatory genes or autoantibodies against complement regulatory proteins are predictive of the severity of the disease and the risk of recurrence post kidney transplantation. Patients with heterozygous mutations in CD46, which encodes membrane cofactor protein (a transmembrane complement-regulatory glycoprotein), usually have a lower incidence of end-stage kidney disease and a decreased risk of recurrent disease post transplant, as wild-type membrane cofactor protein is present in the transplanted kidney. However, some patients with CD46 mutations have a second variant in other complement regulatory genes increasing the severity of disease. The following case report illustrates the course of a young adult patient with end-stage kidney disease initially ascribed to seronegative systemic lupus erythematosus, who presented with biopsy-proven thrombotic microangiopathy following kidney transplantation. It highlights the complexity associated with disorders of complement regulation and the need for a high index of suspicion and genetic testing in patients who present with thrombotic microangiopathy post-transplant.
Publisher: SAGE Publications
Date: 11-2009
DOI: 10.1177/089686080902900609
Abstract: The contribution of peritoneal small solute clearance per se to peritoneal dialysis (PD) patient outcomes remains uncertain. The aim of the present study was to determine whether baseline peritoneal small solute clearance predicted subsequent survival in Australian and New Zealand PD patients. The study included all adult patients in Australia and New Zealand that commenced PD between 1 April 2002 and 31 December 2005 and had a peritoneal Kt/V (pKt/V) measurement performed within 6 months of PD commencement. Time to death and death-censored technique failure were examined by Kaplan–Meier analyses and both univariate and multivariate Cox proportional hazards models. pKt/V measurements were available in 2434 (63%) of the 3841 individuals that began PD treatment in Australia and New Zealand during the study period. These patients were divided into 4 groups according to their baseline pKt/V values: <1.45 (n = 599), 1.45 – 1.69 (n = 550), 1.70 – 2.00 (n = 607), and >2.00 (n = 678). Compared with the reference group (pKt/V 1.70 – 2.00), patient mortality was significantly increased in individuals with pKt/V <1.45 [adjusted hazard ratio (HR) 1.87, 95% confidence interval (CI) 1.24 – 2.84; p = 0.003] and tended to be increased in those with pKt/V 1.45 – 1.69 (adjusted HR 1.46, 95% CI 0.96 – 2.21; p = 0.074). Importantly, higher pKt/V values (>2.00) also tended to be associated with higher mortality (adjusted HR 1.42, 95% CI 0.96 – 2.11; p = 0.079). The other independent predictors of death were lower residual renal function (RRF), older age, peripheral vascular disease, diabetes mellitus, late referral, higher peritoneal permeability, and untreated hypertension. No interaction was observed between pKt/V, RRF, and survival. Death-censored technique failure was significantly worse in the pKt/V 1.45 – 1.69 group (adjusted HR 1.36, 95% CI 1.03 – 1.79; p = 0.028), in older individuals, and in individuals of Asian racial origin.
Initial peritoneal Kt/V significantly and independently influences patient survival in Australian and New Zealand PD patients. Overall survival appears to be optimal in the pKt/V range 1.70 – 2.00, with poorer outcomes observed above and below these values. In particular, survival is significantly worse when the achieved pKt/V is <1.45. In addition, RRF is an important independent predictor of patient survival in the Australian and New Zealand incident PD patient populations. The results of this study should therefore draw attention to the possible danger of not delivering an adequate PD dose to patients with considerable RRF.
Publisher: SAGE Publications
Date: 20-08-2021
Abstract: Beta-trace protein (BTP) is a novel marker for residual kidney function (RKF) that does not require urinary collection. We aimed to examine its utility as a tool for estimating RKF in incident peritoneal dialysis (PD) patients. This was a post hoc analysis of incident PD patients from the balANZ trial cohort. The outcomes evaluated were trends of serum BTP concentration with time, factors associated with change in BTP using mixed-effect multilevel linear regression, and correlation of BTP with mean urinary urea and creatinine clearances (measured glomerular filtration rate (GFR)). Performances of two BTP-derived equations (Shafi-Eqn and Steubl-Eqn) to estimate GFR were evaluated by reporting bias (median difference between estimated and measured GFR), precision (interquartile range of median bias), accuracy (±2 mL/min of measured GFR) and P30 (percentage of estimates within 30% of measured GFR), with confidence intervals (CIs) generated by bootstrapping 2000 replicates. The agreement between BTP-estimated GFR and measured GFR was also plotted graphically on Bland–Altman analysis. The study included 161 PD patients. BTP concentration increased with dialysis vintage and was inversely correlated with measured GFR (r = −0.64). Larger increases in BTP were associated with longer PD vintage and higher dialysate glucose exposure. Biases of BTP-estimated GFRs (Shafi-Eqn and Steubl-Eqn) were 1.2 mL/min/1.73 m2 (95% CI 1.0–1.3 mL/min/1.73 m2) and 0.4 mL/min/1.73 m2 (95% CI 0.2–0.6 mL/min/1.73 m2), respectively. Both BTP-estimated GFRs had poor precision (3.2 mL/min/1.73 m2 (95% CI 2.9–3.5 mL/min/1.73 m2) and 2.8 mL/min/1.73 m2 (95% CI 2.5–3.2 mL/min/1.73 m2), respectively) and poor accuracy of estimates (55% (95% CI 52–60%) and 59% (95% CI 55–63%), respectively). The mean differences between BTP-estimated GFR (Shafi-Eqn and Steubl-Eqn) and measured GFR were −1.14 mL/min/1.73 m2 and −0.42 mL/min/1.73 m2, respectively, with wide limits of agreement on the Bland–Altman plot.
Serum BTP level was inversely related to RKF, but neither BTP-based GFR equation was sufficiently accurate for routine use in PD patients.
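The equation-validation metrics named in the abstract (bias as median difference, precision as the IQR of the bias, accuracy within ±2 mL/min, and P30) can be sketched in a few lines. This is an illustrative implementation of those standard definitions, not the trial's analysis code; names and the toy data are ours:

```python
import statistics

def validation_metrics(estimated, measured):
    """Agreement metrics for a GFR-estimating equation:
    median bias, precision (IQR of the per-patient bias),
    accuracy (share of estimates within +/-2 mL/min of measured GFR),
    and P30 (share of estimates within 30% of measured GFR).
    `estimated` and `measured` are equal-length sequences of GFR values."""
    bias = [e - m for e, m in zip(estimated, measured)]
    q1, _, q3 = statistics.quantiles(bias, n=4)  # quartiles of the bias
    return {
        "median_bias": statistics.median(bias),
        "precision_iqr": q3 - q1,
        "accuracy_2ml": sum(abs(b) <= 2 for b in bias) / len(bias),
        "p30": sum(abs(e - m) <= 0.3 * m
                   for e, m in zip(estimated, measured)) / len(measured),
    }

# Toy example (not trial data):
print(validation_metrics([6.0, 4.5, 11.0, 9.0], [5.0, 4.0, 8.0, 10.0]))
```

Bootstrapping the CIs, as the study did, would simply repeat these calculations over resampled patient sets.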
Publisher: Hindawi Limited
Date: 2017
DOI: 10.1155/2017/9096435
Abstract: The immunosuppressant tacrolimus has a narrow therapeutic window, necessitating therapeutic drug monitoring to maintain efficacy and minimise toxicity. There are very few reports examining the impact of impaired biliary excretion on tacrolimus blood levels or toxicity. We report the case of a 26-year-old combined liver and kidney transplant recipient, who developed acute biliary obstruction leading to tacrolimus toxicity with very high blood tacrolimus levels. Despite a careful evaluation, no alternative cause was found for her acute kidney injury, and her kidney function returned to previous baseline within several days following treatment of the biliary obstruction and temporary withdrawal of tacrolimus.
Publisher: Elsevier BV
Date: 08-2015
DOI: 10.1053/J.AJKD.2015.02.341
Abstract: Research aims to improve health outcomes for patients. However, the setting of research priorities is usually performed by clinicians, academics, and funders, with little involvement of patients or caregivers and using processes that lack transparency. A national workshop was convened in Australia to generate and prioritize research questions in chronic kidney disease (CKD) among diverse stakeholder groups. Patients with CKD (n=23), nephrologists/surgeons (n=16), nurses (n=8), caregivers (n=7), and allied health professionals and researchers (n=4) generated and voted on intervention questions across 4 treatment categories: CKD stages 1 to 5 (non-dialysis dependent), peritoneal dialysis, hemodialysis, and kidney transplantation. The 5 highest ranking questions (in descending order) were as follows: How effective are lifestyle programs for preventing deteriorating kidney function in early CKD? What strategies will improve family consent for deceased donor kidney donation, taking different cultural groups into account? What interventions can improve long-term post-transplant outcomes? What are effective interventions for post hemodialysis fatigue? How can we improve and individualize drug therapy to control post-transplant side effects? Priority questions were focused on prevention, lifestyle, quality of life, and long-term impact. These prioritized research questions can inform funding agencies, patient/consumer organizations, policy makers, and researchers in developing a CKD research agenda that is relevant to key stakeholders.
Publisher: Wiley
Date: 15-06-2017
DOI: 10.1111/NEP.12815
Abstract: Pentoxifylline has been shown to increase haemoglobin levels in patients with chronic kidney disease (CKD) and erythropoietin-stimulating agent (ESA)-hyporesponsive anaemia in the Handling Erythropoietin Resistance with Oxpentifylline multicentre double-blind, randomized controlled trial. The present sub-study evaluated the effects of pentoxifylline on the iron-regulatory hormone hepcidin in patients with ESA-hyporesponsive CKD. This sub-study included 13 patients in the pentoxifylline arm (400 mg daily) and 13 in the matched placebo arm. Hepcidin-25 was measured by ultra-performance liquid chromatography/quadrupole time-of-flight mass spectrometry following isolation from patient serum. Serum hepcidin-25, serum iron biomarkers, haemoglobin and ESA dosage were compared within and between the two groups. Hepcidin-25 concentration at 4 months adjusted for baseline did not differ significantly in pentoxifylline versus placebo treated patients (adjusted mean difference (MD) -7.9 nmol, P = 0.114), although the between-group difference in means translated into a >25% reduction of circulating hepcidin-25 due to pentoxifylline compared with the placebo baseline. In paired analysis, serum hepcidin-25 levels were significantly decreased at 4 months compared with baseline in the pentoxifylline group (-5.47 ± 2.27 nmol/l, P < 0.05) but not in the placebo group (2.82 ± 4.29 nmol/l, P = 0.24). Pentoxifylline did not significantly alter serum ferritin (MD 55.4 mcg/l), transferrin saturation (MD 4.04%), the dosage of ESA (MD -9.93 U/kg per week) or haemoglobin concentration (MD 5.75 g/l). The reduction of circulating hepcidin-25 due to pentoxifylline did not reach statistical significance; however, the magnitude of the difference suggests that pentoxifylline may be a clinically and biologically meaningful modulator of hepcidin-25 in dialysis patients with ESA-hyporesponsive anaemia.
Publisher: SAGE Publications
Date: 2015
DOI: 10.1186/S40697-015-0066-5
Abstract: Erythropoiesis stimulating agent (ESA)-resistant anemia is common in chronic kidney disease (CKD). The aim was to evaluate the determinants of severity of ESA resistance in patients with CKD and primary ESA-resistance. This was a secondary analysis of a randomized controlled trial (the Handling Erythropoietin Resistance with Oxpentifylline, HERO) of 53 adult patients with CKD stage 4 or 5 and primary ESA-resistant anemia (hemoglobin ≤120 g/L; ESA resistance index [ERI] ≥1.0 IU/kg/week/gHb for erythropoietin or ≥0.005 μg/kg/week/gHb for darbepoetin; no cause for ESA-resistance identified). Measurements included iron studies, parathyroid hormone, albumin, liver enzymes, phosphate and markers of oxidative stress and inflammation. Participants were divided into tertiles of ERI, and multinomial logistic regression was used to analyse the determinants of ERI tertiles. All patients, except one, were receiving dialysis for end-stage kidney disease. The mean ± SD ERI values in the low (n = 18), medium (n = 18) and high (n = 17) ERI tertiles were 1.4 ± 0.3, 2.3 ± 0.2 and 3.5 ± 0.8 IU/kg/week/gHb, respectively (P < 0.001). There were no significant differences observed in age, gender, ethnicity, cause of kidney disease, diabetes, iron studies, parathyroid hormone, albumin, liver enzymes, phosphate or markers of oxidative stress and inflammation between the ERI tertiles. The median [inter-quartile range] serum alkaline phosphatase concentrations in the low, medium and high ERI tertiles were 89 [64,121], 99 [76,134] and 148 [87,175] U/L, respectively (P = 0.054). There was a weak but statistically significant association between ERI and serum alkaline phosphatase (R2 = 0.06, P = 0.03). Using multinomial logistic regression, the risk of being in the high ERI tertile relative to the low ERI tertile increased with increasing serum alkaline phosphatase levels (P = 0.02). No other variables were significantly associated with ERI.
Limitations included the small sample size; bone-specific alkaline phosphatase, other markers of bone turnover, and bone biopsies were not evaluated. Serum alkaline phosphatase was associated with severity of ESA resistance in ESA-resistant patients with CKD. Large prospective studies are required to confirm this association. (Trial registration: Australian New Zealand Clinical Trials Registry 12608000199314)
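The ESA resistance index used as an entry criterion above is a simple ratio: weekly ESA dose per kg of body weight per unit of haemoglobin. A minimal sketch for the erythropoietin form, assuming haemoglobin is expressed in g/L (which matches the magnitude of the tertile values reported); the function name and example patient are ours:

```python
def eri_epo(weekly_dose_iu: float, weight_kg: float, hb_g_per_l: float) -> float:
    """ESA resistance index for erythropoietin:
    weekly dose (IU) per kg body weight per g/L haemoglobin,
    giving units of IU/kg/week/gHb as quoted in the abstract."""
    return weekly_dose_iu / weight_kg / hb_g_per_l

# Hypothetical patient: 12,000 IU/week, 80 kg, Hb 110 g/L
print(round(eri_epo(12_000, 80, 110), 2))  # 1.36 -> meets the >=1.0 entry criterion
```

The darbepoetin form is identical except the weekly dose is in μg and the threshold is ≥0.005 μg/kg/week/gHb.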
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 06-2012
Publisher: Oxford University Press (OUP)
Date: 30-11-2009
DOI: 10.1093/NDT/GFP641
Abstract: Background. Enterococcal peritonitis is a serious complication of peritoneal dialysis (PD), although reports of this condition in the literature are exceedingly limited. Methods. The frequency, predictors, treatment and clinical outcomes of enterococcal peritonitis were investigated in all 4675 patients receiving PD in Australia between 1 October 2003 and 31 December 2006. Results. One hundred and sixteen episodes of enterococcal peritonitis occurred in 103 individuals. Enterococcal peritonitis tended to be associated with older age, Maori and Pacific Islander racial origin, renovascular disease and coronary artery disease. Polymicrobial peritonitis, defined as recovery of two or more organisms from dialysate effluent, was significantly more common when an Enterococcus species was isolated than when it was not (45% vs 5%, respectively, P < 0.001, odds ratio 13.4, 95% CI 9.45-19.0). Although international guidelines recommend intraperitoneal ampicillin therapy, only 8% of patients with pure enterococcal peritonitis were treated with this agent, whilst the majority (78%) received vancomycin monotherapy. Overall, 59 (51%) patients with enterococcal peritonitis were successfully treated with antibiotics without experiencing relapse, catheter removal or death. The sole independent predictor of adverse clinical outcomes was recovery of additional (non-Enterococcus) organisms. Polymicrobial enterococcal peritonitis was associated with very high rates of hospitalization (83%), catheter removal (52%), permanent haemodialysis transfer (50%) and death (5.8%). In contrast, clinical outcomes were broadly comparable for pure enterococcal and non-enterococcal peritonitis (hospitalization 75% vs 69%; catheter removal 25% vs 21%; permanent haemodialysis transfer 17% vs 17%; death 1.6% vs 2.2%, respectively), although worse than for non-enterococcal Gram-positive peritonitis (63%, 12%, 3% and 0.6%, respectively).
Removal of the PD catheter within 1 week of enterococcal peritonitis onset was associated with a lower probability of permanent haemodialysis transfer than later removal (74% vs 100%, P = 0.03). Enterococcal peritonitis is associated with an increased risk of catheter removal, permanent haemodialysis transfer and death, particularly when other organisms are isolated in the same episode.
Publisher: Elsevier BV
Date: 08-2012
Publisher: Springer Science and Business Media LLC
Date: 19-10-2012
Publisher: BMJ
Date: 07-2017
Publisher: SAGE Publications
Date: 03-2019
Abstract: The incidence of elderly patients receiving peritoneal dialysis (PD) has increased. This study aimed to examine the clinical presentation and outcomes of peritonitis in elderly PD patients compared with younger PD patients. This single-center, retrospective, observational cohort study included all adult PD patients who developed peritonitis between January 2011 and December 2014. Elderly was defined as ≥ 65 years old at PD initiation. The primary outcome was medical cure, defined as a peritonitis episode cured by antibiotics without being complicated by catheter removal, transfer to hemodialysis (HD), relapsing peritonitis, or death. The secondary outcomes were clinical manifestations (fever, cloudy dialysate) and complications (catheter removal, transfer to HD, relapse, hospitalization, and mortality). Peritonitis outcomes were compared using multivariable logistic regression. Overall, 377 peritonitis episodes occurred in 247 patients. Of these, 126 episodes occurred in 79 elderly patients and 251 episodes occurred in 168 younger patients. Baseline demographic data were comparable between the 2 groups, except that elderly patients were significantly more likely to have diabetes mellitus (66% vs 46%), diabetic nephropathy (55% vs 39%), and a lower serum albumin than younger patients. Medical cure was comparable between the 2 groups (71% vs 72%, respectively, p = 0.67, adjusted odds ratio [AOR] 0.89, 95% confidence interval [CI]: 0.52 – 1.53). Compared with younger patients, elderly patients experiencing peritonitis had lower odds of fever (OR 0.53, 95% CI: 0.30 – 0.94), cloudy dialysate (OR 0.45, 95% CI: 0.23 – 0.88), and catheter removal (AOR 0.50, 95% CI: 0.26 – 0.98), but similar odds of transfer to HD (AOR 0.70, 95% CI: 0.32 – 1.51), relapse (AOR 1.57, 95% CI: 0.46 – 5.40), hospitalization (AOR 1.55, 95% CI: 0.52 – 4.56), and all-cause mortality (AOR 1.88, 95% CI: 0.83 – 4.26). 
Compared with younger patients, elderly PD patients with peritonitis achieved similar medical cure rates, a lower catheter removal rate, and comparable rates of HD transfer, relapse, hospitalization, and death. Elderly PD patients experiencing peritonitis were less likely to present with fever or cloudy dialysate.
Publisher: Elsevier BV
Date: 05-2006
Abstract: Administration of human recombinant erythropoietin (EPO) at time of acute ischemic renal injury (IRI) inhibits apoptosis, enhances tubular epithelial regeneration, and promotes renal functional recovery. The present study aimed to determine whether darbepoetin-alfa (DPO) exhibits comparable renoprotection to that afforded by EPO, whether pro- or anti-apoptotic Bcl-2 proteins are involved, and whether delayed administration of EPO or DPO 6 h following IRI ameliorates renal dysfunction. The model of IRI involved bilateral renal artery occlusion for 45 min in rats (N = 4 per group), followed by reperfusion for 1-7 days. Controls were sham-operated. Rats were treated at time of ischemia or sham operation (T0), or post-treated (6 h after the onset of reperfusion, T6) with EPO (5000 IU/kg), DPO (25 μg/kg), or appropriate vehicle by intraperitoneal injection. Renal function, structure, and immunohistochemistry for Bcl-2, Bcl-XL, and Bax were analyzed. DPO or EPO at T0 significantly abrogated renal dysfunction in IRI animals (serum creatinine for IRI 0.17 +/- 0.05 mmol/l vs DPO-IRI 0.08 +/- 0.03 mmol/l vs EPO-IRI 0.04 +/- 0.01 mmol/l, P = 0.01). Delayed administration of DPO or EPO (T6) also significantly abrogated subsequent renal dysfunction (serum creatinine for IRI 0.17 +/- 0.05 mmol/l vs DPO-IRI 0.06 +/- 0.01 mmol/l vs EPO-IRI 0.03 +/- 0.03 mmol/l, P = 0.01). There was also significantly decreased tissue injury (apoptosis, P < 0.05), decreased proapoptotic Bax, and increased regenerative capacity, especially in the outer stripe of the outer medulla, with DPO or EPO at T0 or T6. These results reaffirm the potential clinical application of DPO and EPO as novel renoprotective agents for patients at risk of ischemic acute renal failure or after having sustained an ischemic renal insult.
Publisher: American Physiological Society
Date: 05-2018
DOI: 10.1152/AJPRENAL.00057.2017
Abstract: Oxidative stress and mitochondrial dysfunction exacerbate acute kidney injury (AKI), but their role in any associated progression to chronic kidney disease (CKD) remains unclear. Antioxidant therapies often benefit AKI, but their benefits in CKD are controversial since clinical and preclinical investigations often conflict. Here we examined the influence of the antioxidant N-acetyl-cysteine (NAC) on oxidative stress and mitochondrial function during AKI (20-min bilateral renal ischemia plus reperfusion/IR) and progression to chronic kidney pathologies in mice. NAC (5% in diet) was given to mice 7 days prior and up to 21 days post-IR (21d-IR). NAC treatment resulted in the following: prevented proximal tubular epithelial cell apoptosis at early IR (40-min postischemia), yet enhanced interstitial cell proliferation at 21d-IR; increased transforming growth factor-β1 expression independent of IR time; and significantly dampened nuclear factor (erythroid-derived 2)-like 2-initiated cytoprotective signaling at early IR. In the long term, NAC enhanced cellular metabolic impairment demonstrated by increased peroxisome proliferator-activated receptor-γ serine-112 phosphorylation at 21d-IR. Intravital multiphoton microscopy revealed increased endogenous fluorescence of nicotinamide adenine dinucleotide (NADH) in cortical tubular epithelial cells during ischemia and at 21d-IR that was not attenuated with NAC. Fluorescence lifetime imaging microscopy demonstrated persistent metabolic impairment by increased free/bound NADH in the cortex at 21d-IR that was enhanced by NAC. Increased mitochondrial dysfunction in remnant tubular cells was demonstrated at 21d-IR by tetramethylrhodamine methyl ester fluorimetry. In summary, NAC enhanced progression to CKD following AKI not only by dampening endogenous cellular antioxidant responses at time of injury but also by enhancing persistent kidney mitochondrial and metabolic dysfunction.
Publisher: Oxford University Press (OUP)
Date: 27-07-2018
DOI: 10.1093/NDT/GFY216
Abstract: Infections are common and can be fatal in patients undergoing long-term dialysis. Recent studies have shown conflicting evidence associating infection with vitamin D status or use of vitamin D and have not been systematically reviewed in this population. We searched PubMed, Web of Science, Cochrane Library, Embase and three Chinese databases from inception until December 2017 for interventional [non-randomized or randomized controlled trials (RCTs)], cohort and case-control studies on levels of serum 25-hydroxyvitamin D [25(OH)D] or use of vitamin D [supplemental nutritional vitamin D or vitamin D receptor activator (VDRA)] and infection (any infection, infection requiring hospitalization, infection-related death, or a composite of these) in long-term dialysis patients. We conducted a meta-analysis on the relative risk (RR) of infection and level of 25(OH)D or use of vitamin D. Of 2440 reports identified, 17 studies met inclusion criteria, all with moderate quality, with 6 cohort studies evaluating 25(OH)D serum concentrations (n = 5714) and 11 (2 RCTs and 9 observational studies) evaluating the use of vitamin D (n = 92 309). The risk of composite infection was 39% lower {relative risk [RR] 0.61 [95% confidence interval (CI) 0.41-0.89]} in the subjects with high or normal levels of 25(OH)D than in those with low levels. When compared with those who did not use vitamin D, the pooled adjusted risk for composite infection was 41% lower in those who used vitamin D [RR 0.59 (95% CI 0.43-0.81)]. High or normal serum levels of 25(OH)D and the use of vitamin D, particularly VDRA, were each associated with a lower risk of composite infection in long-term dialysis patients.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 12-2003
Publisher: Wiley
Date: 05-12-2009
Publisher: Wiley
Date: 05-12-2009
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 11-2013
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 11-2011
DOI: 10.2215/CJN.02200311
Abstract: Hepatitis C virus (HCV) infection is associated with increased mortality and morbidity in end-stage renal failure (ESRF) patients. Despite a lower incidence and risk of transmission of HCV infection with peritoneal dialysis (PD), the optimal dialysis modality for HCV-infected ESRF patients is not known. The aim of this study was to evaluate the impact of dialysis modality on the survival of HCV-infected ESRF patients. The study included all adult incident ESRF patients in Australia and New Zealand who commenced dialysis between January 1, 1994, and December 31, 2008, and were HCV antibody-positive at the time of dialysis commencement. Time to all-cause mortality was compared between hemodialysis (HD) and PD according to modality assignment at day 90, using Cox proportional hazards model analysis. A total of 424 HCV-infected ESRF patients commenced dialysis during the study period and survived for at least 90 days (PD, n = 134; HD, n = 290). Mortality rates were comparable between PD and HD in the first year (10.7 versus 13.8 deaths per 100 patient-years, respectively; adjusted hazard ratio [HR] 0.65, 95% CI 0.34 to 1.26) and thereafter (20 versus 15.9 deaths per 100 patient-years, respectively; HR 1.27, 95% CI 0.86 to 1.88). The survival of HCV-infected ESRF patients is comparable between PD and HD.
Publisher: Oxford University Press (OUP)
Date: 17-03-2016
DOI: 10.1093/NDT/GFW030
Abstract: Successful cannulation of arteriovenous fistulas (AVFs) using a safe and effective technique that minimizes patient harm is a crucial aspect of haemodialysis treatment. Although the current standard of care for many years has been the rope-ladder technique (using sharp needles to cannulate rotating sites across the entire AVF), a number of enthusiasts have recently advocated for the alternative method of buttonhole cannulation (using blunt needles to repeatedly cannulate the same site via a healed track) on the basis of putative, as yet unproven benefits. In this article, we review all available observational studies, randomized controlled trials, and systematic reviews and meta-analyses that have compared the clinical outcomes of buttonhole and rope-ladder cannulation of AVFs. These studies clearly and consistently demonstrated that buttonhole cannulation causes significant and serious infectious harm to haemodialysis patients, especially in the home setting. No strategies or treatments have been proven to effectively mitigate this hazard of buttonhole cannulation. Moreover, buttonhole cannulation is associated with a higher rate of abandonment and has not been shown to have any proven benefit compared with the rope-ladder method. Specifically, buttonhole cannulation has not been shown to reduce cannulation-related pain, improve vascular access survival, reduce vascular access interventions, reduce haematoma formation, improve haemostasis or reduce aneurysm formation. Consequently, rope-ladder cannulation should remain the standard of care and buttonhole cannulation should only be used in rare circumstances (e.g. short segment AVFs where the only alternative is a haemodialysis catheter).
Publisher: Elsevier BV
Date: 07-2016
DOI: 10.1016/J.NIOX.2016.05.002
Abstract: Chronic kidney disease (CKD) is associated with an increased risk of death from cardiovascular disease (CVD). One factor involved in CVD development is nitric oxide (NO), which acts as a powerful vasodilator. NO is produced via the nitrogen cycle, through the reduction of nitrate to nitrite with the process mainly occurring in the mouth by commensal microbiota. People with CKD have compromised microbiota (dysbiosis) with an increased abundance of potentially pathogenic and pro-inflammatory bacteria capable of producing uremic toxins that contribute to CKD development and reduce enzymatic NO production. However, to date, few studies have comprehensively documented the gut or saliva microbiota in the CKD population or investigated the role of NO in people with CKD. This review will discuss NO pathways that are linked to the progression of CKD and CVD and therapeutic options for targeting these pathways.
Publisher: Wiley
Date: 17-09-2017
DOI: 10.1111/NEP.12823
Publisher: AMPCo
Date: 04-2003
Publisher: BMJ
Date: 31-10-2019
DOI: 10.1136/BMJ.L5873
Abstract: To determine the global capacity (availability, accessibility, quality, and affordability) to deliver kidney replacement therapy (dialysis and transplantation) and conservative kidney management. International cross sectional survey. International Society of Nephrology (ISN) survey of 182 countries from July to September 2018. Key stakeholders identified by ISN's national and regional leaders. Markers of national capacity to deliver core components of kidney replacement therapy and conservative kidney management. Responses were received from 160 (87.9%) of 182 countries, comprising 97.8% (7338.5 million of 7501.3 million) of the world's population. A wide variation was found in capacity and structures for kidney replacement therapy and conservative kidney management, namely funding mechanisms, health workforce, service delivery, and available technologies. Information on the prevalence of treated end stage kidney disease was available in 91 (42%) of 218 countries worldwide. Estimates varied more than 800-fold, from 4 to 3392 per million population. Rwanda was the only low income country to report data on the prevalence of treated disease; 5 (9%) of 53 African countries reported these data. Of 159 countries, 102 (64%) provided public funding for kidney replacement therapy. Sixty-eight (43%) of 159 countries charged no fees at the point of care delivery and 34 (21%) made some charge. Haemodialysis was reported as available in 156 (100%) of 156 countries, peritoneal dialysis in 119 (76%) of 156 countries, and kidney transplantation in 114 (74%) of 155 countries. Dialysis and kidney transplantation were available to more than 50% of patients in only 108 (70%) and 45 (29%) of 154 countries that offered these services, respectively. Conservative kidney management was available in 124 (81%) of 154 countries. Worldwide, the median number of nephrologists was 9.96 per million population, which varied with income level.
These comprehensive data show the capacity of countries (including low income countries) to provide optimal care for patients with end stage kidney disease. They demonstrate substantial variability in the burden of such disease and capacity for kidney replacement therapy and conservative kidney management, which have implications for policy.
Publisher: SAGE Publications
Date: 07-2018
Abstract: Acinetobacter is a rare but important cause of peritonitis in peritoneal dialysis (PD) patients. As the complication has not been comprehensively evaluated previously, the present study examined the outcomes of Acinetobacter peritonitis in a large, national cohort of PD patients. The study included all episodes of peritonitis in Australia from January 2004 to December 2014 using Australia and New Zealand Dialysis and Transplant Registry (ANZDATA) data. The primary outcome was peritonitis cure and secondary outcomes were catheter removal, hemodialysis transfer, recurrent/relapsing peritonitis, peritonitis-related hospitalization, and death. Outcomes were compared using multivariable logistic regression. Overall, 5,367 patients experienced 11,122 episodes of peritonitis across 51 centers in Australia. Of these, 228 (4.2%) patients experienced 253 (2.3%) episodes of Acinetobacter peritonitis (176 episodes were due to Acinetobacter alone and 77 involved co-infection with other organisms). Of the 176 solitary Acinetobacter episodes, 131 (74%) achieved cure with antibiotics alone. Compared with Acinetobacter, significantly lower odds of peritonitis cure were observed for Pseudomonas (adjusted odds ratio [AOR] 0.24, 95% confidence interval [CI]: 0.16 – 0.36), other gram-negative organisms (AOR 0.54, 95% CI 0.37 – 0.77), fungi (AOR 0.02, 95% CI 0.01 – 0.03), and polymicrobial organisms (AOR 0.36, 95% CI 0.25 – 0.51), whilst similar odds of cure were observed for Staphylococcus (AOR 0.73, 95% CI 0.50 – 1.06), other gram-positive organisms (AOR 1.32, 95% CI 0.93 – 1.89), culture-negative (AOR 1.19, 95% CI 0.82 – 1.71), and other organisms (AOR 0.72, 95% CI 0.49 – 1.07). The odds of catheter removal and hemodialysis transfer were higher with Pseudomonas, other gram-negative, fungal, and polymicrobial peritonitis than with Acinetobacter peritonitis. The odds of death were also higher with Pseudomonas and fungal peritonitis than with Acinetobacter peritonitis.
Treatment of Acinetobacter peritonitis with gentamicin, ciprofloxacin, or ceftazidime achieved comparable outcomes. Outcomes of Acinetobacter peritonitis were favorable compared with most other forms of organism-specific peritonitis. Commonly used antibiotics covering gram-negative bacteria achieved comparable outcomes in Acinetobacter peritonitis.
Publisher: Massachusetts Medical Society
Date: 25-06-2020
Publisher: Hindawi Limited
Date: 2014
DOI: 10.1155/2014/909373
Abstract: Inflammation at both systemic and local intraperitoneal levels commonly affects peritoneal dialysis (PD) patients. Interest in inflammatory markers as targets of therapeutic intervention has been considerable as they are recognised as predictors of poor clinical outcomes. However, prior to embarking on strategies to reduce inflammatory burden, it is of paramount importance to define the underlying processes that drive the chronic active inflammatory status. The present review aims to comprehensively describe clinical causes of inflammation in PD patients to which potential future strategies may be targeted.
Publisher: Elsevier BV
Date: 10-2015
DOI: 10.1053/J.AJKD.2015.04.051
Abstract: Dental disease is more extensive in adults with chronic kidney disease, but whether dental health and behaviors are associated with survival in the setting of hemodialysis is unknown. This prospective multinational cohort study included 4,205 adults treated with long-term hemodialysis, 2010 to 2012 (Oral Diseases in Hemodialysis [ORAL-D] Study). Dental health was assessed by a standardized dental examination using World Health Organization guidelines, together with personal oral care measures including edentulousness; the decayed, missing, and filled teeth index; teeth brushing and flossing; and dental health consultation. Outcomes were all-cause and cardiovascular mortality at 12 months after dental assessment, analyzed using multivariable-adjusted Cox proportional hazards regression models fitted with shared frailty to account for clustering of mortality risk within countries. During a mean follow-up of 22.1 months, 942 deaths occurred, including 477 cardiovascular deaths. Edentulousness (adjusted HR, 1.29; 95% CI, 1.10-1.51) and a decayed, missing, or filled teeth score ≥ 14 (adjusted HR, 1.70; 95% CI, 1.33-2.17) were associated with early all-cause mortality, while dental flossing, using mouthwash, brushing teeth daily, spending at least 2 minutes on oral hygiene daily, changing a toothbrush at least every 3 months, and visiting a dentist within the past 6 months (adjusted HRs of 0.52 [95% CI, 0.32-0.85], 0.79 [95% CI, 0.64-0.97], 0.76 [95% CI, 0.58-0.99], 0.84 [95% CI, 0.71-0.99], 0.79 [95% CI, 0.65-0.95], and 0.79 [95% CI, 0.65-0.96], respectively) were associated with better survival. Results for cardiovascular mortality were similar. A limitation was the convenience sample of clinics. In adults treated with hemodialysis, poorer dental health was associated with early death, whereas preventive dental health practices were associated with longer survival.
Publisher: Oxford University Press (OUP)
Date: 13-08-2019
DOI: 10.1093/NDT/GFY209
Abstract: It is unclear if haemodiafiltration improves patient survival compared with standard haemodialysis. Observational studies have tended to show benefit with haemodiafiltration, while meta-analyses have not provided definitive proof of superiority. Using data from the Australia and New Zealand Dialysis and Transplant Registry, this binational inception cohort study compared all adult patients who commenced haemodialysis in Australia and New Zealand between 2000 and 2014. The primary outcome was all-cause mortality. Cardiovascular mortality was the secondary outcome. Outcomes were measured from the first haemodialysis treatment and were examined using multivariable Cox regression analyses. Patients were censored at permanent discontinuation of haemodialysis or at 31 December 2014. Analyses were stratified by country. The study included 26 961 patients (4110 haemodiafiltration, 22 851 standard haemodialysis; 22 774 Australia, 4187 New Zealand) with a median follow-up of 5.31 (interquartile range 2.87-8.36) years. Median age was 62 years, 61% were male and 71% were Caucasian. Compared with standard haemodialysis, haemodiafiltration was associated with a significantly lower risk of all-cause mortality [adjusted hazard ratio (HR) for Australia 0.79, 95% confidence interval (95% CI) 0.72-0.87; adjusted HR for New Zealand 0.88, 95% CI 0.78-1.00]. In Australian patients, there was also an association between haemodiafiltration and reduced cardiovascular mortality (adjusted HR 0.78, 95% CI 0.64-0.95). Haemodiafiltration was associated with superior survival across patient subgroups of age, sex and comorbidity.
Publisher: Informa UK Limited
Date: 02-01-2016
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 27-10-2004
Publisher: Elsevier BV
Date: 07-2017
DOI: 10.1053/J.AJKD.2016.11.008
Abstract: Intraperitoneal administration of antibiotics is recommended as a first treatment for managing peritoneal dialysis (PD)-related peritonitis. However, the efficacy of oral administration of quinolones has not been well studied. In this randomized controlled pilot study, 80 eligible patients with PD-related peritonitis from Peking University First Hospital (40 in each arm) were assigned to intraperitoneal vancomycin, 1 g, every 5 days plus oral moxifloxacin, 400 mg, every day (treatment group) or intraperitoneal vancomycin, 1 g, every 5 days plus intraperitoneal ceftazidime, 1 g, every day (control group). The primary end point was complete resolution of peritonitis, and secondary end points were primary or secondary treatment failure, assessed using PD effluent white blood cell counts. Baseline demographic and clinical characteristics of the 2 groups were comparable. There were 24 and 22 Gram-positive organisms, 6 and 7 Gram-negative organisms, 9 and 10 culture-negative samples, and 1 and 1 fungal sample in the treatment and control groups, respectively. Complete resolution of peritonitis was achieved in 78% and 80% of cases in the treatment and control groups, respectively (OR, 0.86; 95% CI, 0.30-2.52; P=0.8). There were 3 and 1 cases of relapse in the treatment and control groups, respectively. Primary and secondary treatment failure rates were not significantly different (33% vs 20% and 10% vs 13%, respectively). In each group, there was 1 peritonitis-related death and 6 transfers to hemodialysis therapy. During the 3-month follow-up period, 7 and 3 successive episodes of peritonitis occurred in the treatment and control groups, respectively. Only 2 adverse drug reactions (mild nausea and mild rash, respectively) were observed in the 2 groups. Sample size was relatively small and the eligibility ratio was low. Also, the number of peritonitis episodes was low, limiting the power to detect a difference between groups.
This pilot study suggests that intraperitoneal vancomycin with oral moxifloxacin is a safe, well-tolerated, practical, and effective first-line treatment for PD-related peritonitis. Larger adequately powered clinical trials are warranted.
Publisher: S. Karger AG
Date: 2000
DOI: 10.1159/000045837
Abstract: Background/Aims: Several recent studies have suggested that angiotensin-converting enzyme (ACE) inhibitors ameliorate chronic cyclosporin A (CyA) tubulo-interstitial disease by mechanisms independent of their antihypertensive effects. The aim of the present study was to determine whether ACE inhibition exerts a direct beneficial effect on the tubulo-interstitium in an in vitro model of chronic CyA nephropathy. Methods: Primary cultures of human proximal tubular cells (PTC) and renal cortical fibroblasts (CF) were exposed for 24 h to CyA in the presence or absence of enalaprilat. Parameters of tubulo-interstitial nephrotoxicity were then measured, including collagen synthesis (proline incorporation), tubular viability and function (thymidine incorporation, lactate dehydrogenase release, and apical sodium-hydrogen exchange), and secretion of insulin-like growth factor I, transforming growth factor beta 1 (TGFβ1), and platelet-derived growth factor. Results: CyA promoted CF collagen synthesis, PTC cytotoxicity (suppressed viability, growth and sodium transport), and tubulo-interstitial fibrogenic cytokine release (CF secretion of insulin-like growth factor I and PTC secretion of TGFβ1 and platelet-derived growth factor). Enalaprilat completely reversed the stimulatory effects of CyA on CF collagen synthesis (CyA + enalaprilat 6.40 ± 0.50% vs. CyA alone 8.33 ± 0.56% vs. control 6.57 ± 0.62% vs. enalaprilat alone 5.55 ± 0.93%, p < 0.05) and PTC secretion of TGFβ1 (0.71 ± 0.11, 1.13 ± 0.09, 0.89 ± 0.07, and 0.67 ± 0.09 ng/mg protein/day, respectively, p < 0.05). However, the other manifestations of CyA toxicity were not significantly reversed by concomitant enalaprilat administration. Conclusions: ACE inhibition directly prevents CyA-induced interstitial fibrosis, but not proximal tubule cytotoxicity, independently of haemodynamic and systemic renin-angiotensin system effects.
Renoprotection may be partially afforded by directly preventing the tubular secretion of TGFβ1.
Publisher: Elsevier BV
Date: 02-2008
Abstract: We compared survival and death-censored technique survival in patients on automated peritoneal dialysis (automated dialysis) or on continuous ambulatory peritoneal dialysis. All 4128 patients from the Australia and New Zealand Dialysis and Transplant Registry who started peritoneal dialysis over a 5-year period through March 2004 were included. Times to death and death-censored technique failure were analyzed by Cox proportional hazards models while a conditional risk set model computed technique failure. Compared to patients treated entirely with continuous ambulatory peritoneal dialysis, automated peritoneal dialysis patients were more likely to be young, Caucasian, have marginally lower body mass index, and were less likely to have baseline cardiovascular disease or diabetes. Using univariate and multivariate analysis, our study showed there were no significant differences in patient survival and death-censored technique failure between the two types of peritoneal dialysis modalities.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 2004
Publisher: Oxford University Press (OUP)
Date: 23-07-2019
DOI: 10.1093/NDT/GFY204
Abstract: Peritoneal dialysis (PD)-related infections lead to significant morbidity. The International Society for Peritoneal Dialysis (ISPD) guidelines for the prevention and treatment of PD-related infections are based on variable evidence. We describe practice patterns across facilities participating in the Peritoneal Dialysis Outcomes and Practice Patterns Study (PDOPPS). PDOPPS, a prospective cohort study, enrolled nationally representative samples of PD patients in Australia/New Zealand (ANZ), Canada, Thailand, Japan, the UK and the USA. Data on PD-related infection prevention and treatment practices across facilities were obtained from a survey of medical directors. A total of 170 centers, caring for 000 patients, were included. The proportion of facilities reporting antibiotic administration at the time of PD catheter insertion was lowest in the USA (63%) and highest in Canada and the UK (100%). Exit-site antimicrobial prophylaxis was variably used across countries, with Japan (4%) and Thailand (28%) having the lowest proportions. Exit-site mupirocin was the predominant exit-site prophylactic strategy in ANZ (56%), Canada (50%) and the UK (47%), while exit-site aminoglycosides were more common in the USA (72%). Empiric Gram-positive peritonitis treatment with vancomycin was most common in the UK (88%) and USA (83%) compared with 10–45% elsewhere. Empiric Gram-negative peritonitis treatment with aminoglycoside therapy was highest in ANZ (72%) and the UK (77%) compared with 10–45% elsewhere. Variation in PD-related infection prevention and treatment strategies exists across countries, with limited uptake of ISPD guideline recommendations. Further work will aim to understand the impact these differences have on the wide variation in infection risk between facilities and other clinically relevant PD outcomes.
Publisher: Elsevier BV
Date: 11-2016
DOI: 10.1053/J.AJKD.2016.05.015
Abstract: Guidelines preferentially recommend noncalcium phosphate binders in adults with chronic kidney disease (CKD). We compared and ranked phosphate-binder strategies for CKD in a network meta-analysis of randomized trials with allocation to phosphate binders in adults with CKD. Interventions comprised sevelamer, lanthanum, iron, calcium, colestilan, bixalomer, nicotinic acid, and magnesium. The primary outcome was all-cause mortality. Additional outcomes were cardiovascular mortality, myocardial infarction, stroke, adverse events, serum phosphorus and calcium levels, and coronary artery calcification. 77 trials (12,562 participants) were included. Most studies (62 trials in 11,009 patients) were performed in a dialysis population. Trials were generally of short duration (median, 6 months) and had high risks of bias. All-cause mortality was ascertained in 20 studies during 86,744 patient-months of follow-up. There was no evidence that any drug class lowered mortality or cardiovascular events when compared to placebo. Compared to calcium, sevelamer reduced all-cause mortality (OR, 0.39; 95% CI, 0.21-0.74), whereas treatment effects of lanthanum, iron, and colestilan were not significant (ORs of 0.78 [95% CI, 0.16-3.72], 0.37 [95% CI, 0.09-1.60], and 0.55 [95% CI, 0.07-4.43], respectively). Lanthanum caused nausea, whereas sevelamer posed the highest risk for constipation and iron caused diarrhea. All phosphate binders lowered serum phosphorus levels to a greater extent than placebo, with iron ranked as the best treatment. Sevelamer and lanthanum posed substantially lower risks for hypercalcemia than calcium. Limitations included limited testing of consistency and short follow-up. There is currently no evidence that phosphate-binder treatment reduces mortality compared to placebo in adults with CKD. It is not clear whether the higher mortality with calcium versus sevelamer reflects net harm associated with calcium, net benefit with sevelamer, both, or neither.
Iron-based binders show evidence of greater phosphate lowering that warrants further examination in randomized trials.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 05-2013
DOI: 10.2215/CJN.08770812
Abstract: This study aimed to evaluate dialysis and transplant outcomes of patients with ESRD secondary to ANCA-associated vasculitis (AAV). All ESRD patients who commenced renal replacement therapy in Australia and New Zealand between 1996 and 2010 were included. Outcomes were assessed by Kaplan–Meier, multivariable Cox regression, and competing-risks regression survival analyses. Of 36,884 ESRD patients, 228 had microscopic polyangiitis (MPA) and 221 had granulomatosis with polyangiitis (GPA). Using competing-risks regression, compared with other causes of ESRD, MPA patients (hazard ratio [HR], 0.89; 95% confidence interval [95% CI], 0.73–1.08; P=0.24) and GPA patients (HR, 0.94; 95% CI, 0.74–1.19; P=0.62) experienced comparable survival on dialysis. Forty-six MPA patients (21%) and 47 GPA patients (20%) received 98 renal allografts. Respective 10-year first graft survival rates in MPA, GPA, and non-AAV patients were 50%, 62%, and 70%, whereas patient survival rates were 68%, 85% and 83%, respectively. Compared with non-AAV patients, MPA transplant recipients had higher risks of graft failure (HR, 1.87; 95% CI, 1.07–3.25; P=0.03) and death (HR, 1.94; 95% CI, 1.02–3.69; P=0.04), whereas GPA transplant recipients experienced comparable renal allograft survival (HR, 0.91; 95% CI, 0.43–1.93; P=0.81) and patient survival (HR, 0.58; 95% CI, 0.23–2.27; P=0.58). AAV recurrence was observed in two renal allografts (2%). Compared with ESRD patients without AAV, those with GPA have comparable renal replacement therapy outcomes, whereas MPA patients have comparable dialysis survival but poorer renal transplant allograft and patient survival rates.
Publisher: Wiley
Date: 19-07-2001
Publisher: Oxford University Press (OUP)
Date: 12-01-2018
DOI: 10.1093/NDT/GFX356
Abstract: Differences in the epidemiology of post-transplant lymphoproliferative disease (PTLD) between adult and paediatric kidney transplant recipients remain unclear. Using the Australian and New Zealand Dialysis and Transplant Registry (1963-2015), the cumulative incidences of PTLD in children (age <20 years) and adults were calculated using a competing risk of death model and compared with age-matched population-based data using standardized incidence ratios (SIRs). Risk factors for PTLD were assessed using Cox proportional hazards regression. Among 23 477 patients (92% adult, 60% male), 505 developed PTLD, with 50 (10%) occurring in childhood recipients. The 25-year cumulative incidence of PTLD was 3.3% [95% confidence interval (CI) 2.9-3.6] for adult recipients and 3.6% (95% CI 2.7-4.8) for childhood recipients. Childhood recipients had a 30-fold increased risk of lymphoma compared with the age-matched general population [SIR 29.5 (95% CI 21.9-38.8)], higher than adult recipients [SIR 8.4 (95% CI 7.7-9.2)]. Epstein-Barr virus (EBV)-negative recipient serology [adjusted hazard ratio (aHR) 3.33 (95% CI 2.21-5.01), P < 0.001], year of transplantation [aHR 0.93 for each year after the year 2000 (95% CI 0.88-0.99), P = 0.02], induction with an agent other than anti-CD25 monoclonal antibody [aHR 2.07 (95% CI 1.16-3.70), P = 0.01] and having diabetes [aHR 3.49 (95% CI 2.26-5.38), P < 0.001] were independently associated with PTLD. Lymphoma occurs at similar rates in adult and paediatric recipients, but has been decreasing since the year 2000. EBV-negative patients and those with diabetes or induction agent other than anti-CD25 monoclonal antibody are at substantially increased risk of PTLD.
Publisher: Therapeutic Guidelines Limited
Date: 02-2017
Publisher: Elsevier BV
Date: 11-2011
DOI: 10.1053/J.JRN.2010.12.002
Abstract: To investigate the effect of dietitian involvement in a multidisciplinary lifestyle intervention comparing risk factor modification for cardiovascular disease with standard posttransplant care in renal transplant recipients (RTR) with abnormal glucose tolerance (AGT). In this randomized controlled trial in a hospital outpatient department, adult RTR with AGT were randomized to a lifestyle intervention consisting of regular consultations with the dietitian and multidisciplinary team, or to standard care. Outcome measures were dietary intake, physical activity (PA) levels, cardiorespiratory fitness (CF), and anthropometry. Total fat and percent saturated fat intake were significantly lower in the intervention group than in the control group at 2-year follow-up: 54 g (16 to 105 g) versus 65 g (34 to 118 g), P = .01, and 10% (5% to 17%) versus 13% (4% to 20%), P = .05, respectively. There was a trend for overweight (but not obese) individuals to lose more weight in the intervention group (4% loss vs. a gain of 0.25% at the 2-year follow-up). Overall, RTR were significantly less fit than age- and gender-matched controls; mean peak oxygen uptake was 19.42 ± 7.09 mL/kg per minute versus 28.35 ± 8.80 mL/kg per minute, P < .001. Simple exercise advice was not associated with any improvement in total PA or CF in either group at the 2-year follow-up. Dietary advice can contribute to healthier eating habits and a trend for weight loss in RTR with AGT. These improvements, in conjunction with multidisciplinary care and pharmacological treatment, can lead to improvements in cardiovascular risk factors such as lipid profile. Simple advice to increase PA was not effective in improving CF, and other measures are needed.
Publisher: Elsevier BV
Date: 09-2023
Publisher: Wiley
Date: 17-04-2012
DOI: 10.1111/J.1440-1797.2012.01572.X
Abstract: Chronic kidney disease (CKD) is a common and serious problem that adversely affects human health, limits longevity and increases costs to health-care systems worldwide. Its increasing incidence cannot be fully explained by traditional risk factors. Oxidative stress is prevalent in CKD patients and is considered to be an important pathogenic mechanism. Oxidative stress develops from an imbalance between free radical production often increased through dysfunctional mitochondria formed with increasing age, type 2 diabetes mellitus, inflammation, and reduced anti-oxidant defences. Perturbations in cellular oxidant handling influence downstream cellular signalling and, in the kidney, promote renal cell apoptosis and senescence, decreased regenerative ability of cells, and fibrosis. These factors have a stochastic deleterious effect on kidney function. The majority of studies investigating anti-oxidant treatments in CKD patients show a reduction in oxidative stress and many show improved renal function. Despite heterogeneity in the oxidative stress levels in the CKD population, there has been little effort to measure patient oxidative stress levels before the use of any anti-oxidants therapies to optimize outcome. This review describes the development of oxidative stress, how it can be measured, the involvement of mitochondrial dysfunction and the molecular pathways that are altered, the role of oxidative stress in CKD pathogenesis and an update on the amelioration of CKD using anti-oxidant therapies.
Publisher: Wiley
Date: 15-02-2006
Publisher: Oxford University Press (OUP)
Date: 13-07-2004
Publisher: Elsevier BV
Date: 10-2016
DOI: 10.1053/J.AJKD.2016.03.418
Abstract: Left ventricular mass (LVM) is a widely used surrogate end point in randomized trials involving people with chronic kidney disease (CKD) because treatment-induced LVM reductions are assumed to lower cardiovascular risk. The aim of this study was to assess the validity of LVM as a surrogate end point for all-cause and cardiovascular mortality in CKD. This systematic review and meta-analysis included randomized controlled trials of any pharmacologic or nonpharmacologic intervention in participants with any stage of CKD that had 3 or more months' follow-up and reported LVM data. The surrogate outcome of interest was LVM change from baseline to last measurement, and clinical outcomes of interest were all-cause and cardiovascular mortality. Standardized mean differences (SMDs) of LVM change and relative risk for mortality were estimated using pairwise random-effects meta-analysis. Correlations between surrogate and clinical outcomes were summarized across all interventions combined using bivariate random-effects Bayesian models, and 95% credible intervals were computed. 73 trials (6,732 participants) covering 25 intervention classes were included in the meta-analysis. Overall, risk of bias was uncertain or high. Only 3 interventions reduced LVM: erythropoiesis-stimulating agents (9 trials; SMD, -0.13; 95% CI, -0.23 to -0.03), renin-angiotensin-aldosterone system inhibitors (13 trials; SMD, -0.28; 95% CI, -0.45 to -0.12), and isosorbide mononitrate (2 trials; SMD, -0.43; 95% CI, -0.72 to -0.14). All interventions had uncertain effects on all-cause and cardiovascular mortality. There were weak and imprecise associations between the effects of interventions on LVM change and all-cause mortality (32 trials; 5,044 participants; correlation coefficient, 0.28; 95% credible interval, -0.13 to 0.59) and cardiovascular mortality (13 trials; 2,327 participants; correlation coefficient, 0.30; 95% credible interval, -0.54 to 0.76). Limitations included limited long-term data and the suboptimal quality of included studies.
There was no clear and consistent association between intervention-induced LVM change and mortality. Evidence for LVM as a valid surrogate end point in CKD is currently lacking.
Publisher: American Physiological Society
Date: 10-2016
DOI: 10.1152/AJPRENAL.00042.2016
Abstract: Cyclic nucleotide signal transduction pathways are an emerging research field in kidney disease. Activated cell surface receptors transduce their signals via intracellular second messengers such as cAMP and cGMP. There is increasing evidence that regulation of the cGMP-cGMP-dependent protein kinase 1-phosphodiesterase (cGMP-cGK1-PDE) signaling pathway may be renoprotective. Selective PDE5 inhibitors have shown potential in treating kidney fibrosis in patients with chronic kidney disease (CKD), via their downstream signaling, and these inhibitors also have known activity as antithrombotic and anticancer agents. This review gives an outline of the cGMP-cGK1-PDE signaling pathways and details the downstream signaling and regulatory functions that are modulated by cGK1 and PDE inhibitors with regard to antifibrotic, antithrombotic, and antitumor activity. Current evidence that supports the renoprotective effects of regulating cGMP-cGK1-PDE signaling is also summarized. Finally, the effects of icariin, a natural plant extract with PDE5 inhibitory function, are discussed. We conclude that regulation of cGMP-cGK1-PDE signaling might provide novel, therapeutic strategies for the worsening global public health problem of CKD.
Publisher: Wiley
Date: 23-04-2017
Publisher: Dustri-Verlag Dr. Karl Feistle
Date: 2003
DOI: 10.5414/CNP59056
Abstract: Valproate intoxication is a relatively common clinical problem that can result in coma, respiratory depression, pancytopenia, hemodynamic instability and death [Fernandez et al. 1996, Franssen et al. 1999]. The drug's relatively low molecular weight, small volume of distribution and saturable protein-binding render it potentially amenable to extracorporeal removal (hemofiltration, hemodialysis or hemoperfusion), but published experience is scarce. This report describes a woman with a potentially fatal sodium valproate overdose, who did not respond to continuous veno-venous hemodiafiltration, but was successfully treated with low-flux hemodialysis. Based on our experience, we recommend hemodialysis for serious valproate intoxication.
Publisher: Public Library of Science (PLoS)
Date: 17-01-2019
Publisher: Elsevier BV
Date: 06-2016
DOI: 10.1016/J.KINT.2016.02.014
Abstract: Patient outcomes in end-stage kidney disease (ESKD) secondary to lupus nephritis have not been well described. To help define this we compared dialysis and transplant outcomes of patients with ESKD due to lupus nephritis to all other causes. All patients diagnosed with ESKD who commenced renal replacement therapy in Australia and New Zealand (1963-2012) were included. Clinical outcomes were evaluated in both a contemporary cohort (1998-2012) and the entire 50-year cohort. Of 64,160 included patients, 744 had lupus nephritis as the primary renal disease. For the contemporary cohort of 425 patients with lupus nephritis, the 5-year dialysis patient survival rate was 69%. Of 176 contemporary patients with lupus nephritis who received their first renal allograft, the 5-year patient, overall renal allograft, and death-censored renal allograft survival rates were 95%, 88%, and 93%, respectively. Patients with lupus nephritis had worse dialysis patient survival (adjusted hazard ratio 1.33, 95% confidence interval 1.12-1.58) and renal transplant patient survival (adjusted hazard ratio 1.87, 95% confidence interval 1.18-2.98), but comparable overall renal allograft survival (adjusted hazard ratio 1.19, 95% confidence interval 0.84-1.68) and death-censored renal allograft survival (adjusted hazard ratio 1.05, 95% confidence interval 0.68-1.62) compared with ESKD controls. Similar results were found in the entire cohort and when using competing-risks analysis. Thus, the ESKD of lupus nephritis was associated with worse dialysis and transplant patient survival but comparable renal allograft survival compared with other causes of ESKD.
Publisher: Elsevier BV
Date: 09-1998
Publisher: Oxford University Press (OUP)
Date: 11-2008
DOI: 10.1093/NDT/GFN320
Publisher: Oxford University Press (OUP)
Date: 18-12-2009
DOI: 10.1093/NDT/GFN684
Abstract: The impact of dialysis modality on the rates and types of infectious complications has not been well studied. The aim of the present investigation was to evaluate the rates of hepatitis C virus (HCV) and hepatitis B virus (HBV) infections in peritoneal dialysis (PD) and haemodialysis (HD) patients in the Asia-Pacific region. The study included the most recent period-prevalent data recorded in the national or regional dialysis registries of the 10 Asia-Pacific countries/areas (Australia, New Zealand, Japan, China, Taiwan, Korea, Thailand, Hong Kong, Malaysia and India), where such data were available. Longitudinal data were also available for all incident Australian and New Zealand patients commencing dialysis between 1 April 1995 and 31 December 2005. Rates of HCV and HBV infections were compared by chi-square, Poisson regression and Kaplan-Meier survival analyses, as appropriate. Data were obtained on 201,590 patients (HD 173,788; PD 27,802). HCV seroprevalences ranged between 0.7% and 18.1% across different countries and were generally higher in HD versus PD populations (7.9% +/- 5.5% versus 3.0% +/- 2.0%, P = 0.01). Seroconversion rates on dialysis were also significantly higher in HD patients (incidence rate ratio PD versus HD 0.33, 95% CI 0.13-0.75). HCV infection was highly predictive of mortality in Japan (relative risk 1.37, 95% CI 1.15-1.62, P = 0.003) and in Australia and New Zealand (adjusted hazards ratio 1.29, 95% CI 1.05-1.58). HBV infection data were limited, but less clearly influenced by dialysis modality. Dialysis modality selection significantly influences the risk of HCV infection experienced by end-stage renal failure patients in the Asia-Pacific region. No such association could be identified for HBV infection.
Publisher: BMJ
Date: 20-03-2015
Publisher: AMPCo
Date: 09-2018
DOI: 10.5694/MJA18.00544
Publisher: American Medical Association (AMA)
Date: 08-2017
Publisher: Oxford University Press (OUP)
Date: 30-05-2006
DOI: 10.1093/NDT/GFL268
Abstract: The aim of the present study was to determine whether the deceased donor kidney side (left or right kidney) was predictive of subsequent kidney transplant outcomes. A retrospective analysis was undertaken of the left-right deceased donor kidney pairs transplanted into recipients with end-stage renal failure in Queensland between 1 April 1994 and 31 March 2004. A total of 201 left-right deceased donor kidney pairs were transplanted into 402 patients. The baseline characteristics of the recipients in the two groups were comparable, except that the patients receiving right kidneys had lower body mass indices and shorter cold ischaemic times. No differences were seen between the left and right kidney recipient groups with respect to operative duration (3.02 +/- 0.67 vs 3.12 +/- 0.72 h, P = 0.16), warm ischaemic time (0.62 +/- 0.18 vs 0.65 +/- 0.21 h, P = 0.09), delayed graft function (4% vs 6%, P = 0.26) or a composite vascular, haemorrhagic, ureteric and infective post-operative complication end-point (22% vs 22%, P = 0.90). Estimated glomerular filtration rates were almost identical at 1 month (52.7 +/- 39.6 vs 51.0 +/- 24.0 ml/min/1.73 m(2), P = 0.34) and remained comparable thereafter. Respective death-censored graft survival rates for left and right kidney recipients were 100% and 100% at 1 year, 99.4% and 96.4% at 3 years, and 96.3% and 95.5% at 5 years (P = 0.67). Although left and right deceased donor kidneys present different operative challenges, the present results suggest that the probability of early post-operative complications, delayed graft function, impaired early and medium-term renal allograft function or death-censored graft failure is comparable between left and right kidney recipients.
Publisher: Wiley
Date: 22-05-2007
DOI: 10.1111/J.1440-1797.2007.00804.X
Abstract: Metabolic syndrome (MS) is a significant risk factor for cardiovascular disease, mortality and chronic kidney disease (CKD) in the general population. However, the prevalence, predictors, prognostic value and treatment of MS in the CKD population have not been rigorously studied. The study involved 200 stages 4 and 5 CKD patients enrolled in a randomized controlled trial of intensive multiple risk factor modification (targeting hypercholesterolaemia, hyperhomocysteinaemia, anaemia and disordered bone mineral metabolism) versus usual care. Participants were followed for a median period of 22 months. The overall prevalence of MS was 30.5%. MS was independently predicted by older age, peritoneal dialysis and Maori/Pacific Islander origin. When laboratory parameters were included as covariates, the only significant predictors of MS were higher serum malondialdehyde and lower serum adiponectin concentrations. MS was an independent predictor of time to a composite end-point of cardiovascular death, acute coronary syndrome, revascularization, non-fatal stroke and amputation (adjusted hazard ratio 2.46, 95% CI 1.17-5.18). No significant difference in cardiovascular event-free survival was observed in those allocated to intensive risk factor modification compared with usual care. Metabolic syndrome occurs in 30.5% of stages 4 and 5 CKD patients and is associated with older age, peritoneal dialysis, ethnicity, increased oxidative stress, lower serum adiponectin concentrations and a significantly increased risk of future cardiovascular events. Intervention strategies targeting hypercholesterolaemia, hyperhomocysteinaemia, anaemia and disordered bone mineral metabolism may not be effective in ameliorating the heightened cardiovascular risk of CKD patients with MS.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 27-10-2004
Publisher: Wiley
Date: 04-2019
DOI: 10.1111/IMJ.14244
Abstract: Data linkage is a valuable technique for uniting information from multiple sources that relates to the same person, place, family or event. Despite its value, establishing such linkages in Australia remains challenging. Existing policies are a missed opportunity for research and innovation and engender a negative attitude among researchers when considering data linkage as a research method. Greater leadership from the Population Health Research Network and cooperation from data custodians and Human Research Ethics Committees are necessary to optimally access Australia's enormous data potential.
Publisher: SAGE Publications
Date: 11-03-2021
Abstract: Despite the implementation of a ‘Peritoneal Dialysis (PD) First’ policy in Thailand since 2008, nationwide PD practices and patients’ outcomes have rarely been reported. As part of the multinational PD Outcomes and Practice Patterns Study (PDOPPS), PD patients from 22 PD centres from different geographic regions, sizes and affiliations, representing Thailand PD facilities, have been enrolled starting in May 2016. Demographic, clinical and laboratory data and patients’ outcomes were prospectively collected and analysed. The pilot and implementation phases demonstrated excellent concordance between study data and validation data collected at enrolment. In the implementation phase, 848 PD patients (including 262 (31%) incident PD patients) were randomly sampled from 5090 patients in participating centres. Almost all participants (95%) performed continuous ambulatory PD (CAPD), and a high proportion had hypoalbuminemia (67%, serum albumin <3.5 g/dL), anaemia (42%, haemoglobin g/dL) and hypokalaemia (37%, serum potassium <3.5 mmol/L). The peritonitis rate was 0.40 episodes/year, but the culture-negative rate was high (0.13 episodes/year, 28% of total episodes). The patients from PD clinics located in the Bangkok metropolitan region had higher socio-economic status, more optimal nutritional markers, blood chemistries and haemoglobin levels, and lower peritonitis rates compared to the provincial regions, emphasizing the centre effect on key success factors in PD. Participation in the PDOPPS helps unveil the critical barriers to improving outcomes of PD patients in Thailand, including a high prevalence of hypokalaemia, anaemia, poor nutritional status and culture-negative peritonitis. These factors should be acted upon to formulate solutions and implement quality improvement on a national level.
Publisher: Elsevier BV
Date: 2017
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 05-2005
Publisher: Wiley
Date: 23-12-2011
DOI: 10.1111/J.1440-1797.2010.01390.X
Abstract: Peritoneal dialysis technique survival in Australia and New Zealand is lower than in other parts of the world. More than two-thirds of technique failures are related to infective complications (predominantly peritonitis) and 'social reasons'. Practice patterns vary widely and more than one-third of peritoneal dialysis units do not meet the International Society of Peritoneal Dialysis minimum accepted peritonitis rate. In many cases, poor peritonitis outcomes reflect significant deviations from international guidelines. In this paper we propose a series of practical recommendations to improve outcomes in peritoneal dialysis patients through appropriate patient selection, prophylaxis and treatment of infectious complications, investigation of social causes of technique failure and a greater focus on patient education and clinical governance.
Publisher: Elsevier BV
Date: 05-2021
Publisher: Wiley
Date: 04-1998
DOI: 10.1111/J.1469-7793.1998.587BQ.X
Abstract: 1. In order to determine the role of the insulin-like growth factor-I (IGF-I)/IGF binding protein (IGFBP) axis in the augmentation of tubule growth and function following reductions in nephron mass, primary cultures of human proximal tubule cells (PTCs) were generated from the histologically normal sections of ten surgically removed kidneys. 2. PTC hypertrophy (cellular protein content), DNA synthesis (thymidine incorporation) and apical sodium-hydrogen exchange (NHE) activity (ethylisopropylamiloride-sensitive apical 22Na+ uptake) were measured following 24 h incubation in media supplemented with 10 % pre- or post-nephrectomy sera obtained from these patients. The results were compared with the effects of pre- and post-operative control sera collected from seven patients undergoing retroperitoneal operations not involving removal of renal tissue. 3. Day 1 post-nephrectomy sera promoted a significant 73 % increase in apical NHE activity, which was accompanied by a significant increase in PTC binding of 125I-IGF-I (post- vs. pre-nephrectomy, 163 +/- 6 vs. 142 +/- 4 fmol (mg protein)-1; P < 0.05). Subsequent post-nephrectomy sera significantly stimulated PTC protein content and thymidine incorporation, peaking at day 7 (127.7 +/- 14.0 and 118.4 +/- 9.0 % of pre-nephrectomy values, respectively; P < 0.05). The growth effects were cell specific, as they were not observed with renal cortical fibroblasts. No change was detected in any of these measured variables following exposure to control sera. 4. Serum IGF-I and IGFBP-1 levels did not significantly change over time or between groups. IGFBP-3 levels progressively decreased in both control and nephrectomized sera from pre-operative values of 3580 +/- 305 and 3360 +/- 217 ng ml-1, respectively, to 2670 +/- 341 and 2600 +/- 347 ng ml-1 at 1 week post-operation. Serum IGFBP-2 levels increased to a comparable extent in both controls (day 0 vs. day 7, 2940 +/- 1024 vs. 
7010 +/- 2520 ng ml-1; P < 0.01) and nephrectomized patients (day 0 vs. day 7, 3070 +/- 656 vs. 9130 +/- 2010 ng ml-1; P < 0.01). 5. The results indicate that nephrectomy engenders the elaboration of one or more humoral factor(s), which promotes increased binding of IGF-I to PTCs and which may in turn specifically stimulate PTC Na+ transport and growth.
Publisher: Springer Science and Business Media LLC
Date: 09-06-2011
DOI: 10.1007/S00228-011-1071-Y
Abstract: The aims of this study were to evaluate the predictive performances of all previously derived limited sampling strategies (LSSs) for total prednisolone, and to derive LSSs for free prednisolone in an independent cohort of adult kidney transplant recipients. Total and free prednisolone area under the concentration-time curve profiles from 0 to 12 hours post-dose (AUC(0-12)) were collected from 20 subjects. All previously published total prednisolone LSSs were identified from the literature. Free prednisolone LSSs were developed using multiple linear regression analyses. AUC predicted by each of the LSSs was compared with AUC(0-12). Median percentage prediction error (MPPE) and median absolute percentage prediction error (MAPE) were calculated to evaluate bias and imprecision. Total dose-adjusted prednisolone exposure varied 5-fold among study participants, while free dose-adjusted prednisolone exposure varied 3-fold. Correlation (r²) between total and free prednisolone AUC(0-12) was 0.79 (p < 0.0001) for the entire study cohort. This correlation was poorer in those early compared with late post-transplant (r² = 0.42 (p = 0.04) versus r² = 0.59 (p = 0.009), respectively). Ten previously published LSSs for total prednisolone and 15 derived LSSs for free prednisolone performed with acceptable levels of bias and imprecision (<15%). Of the free prednisolone LSSs, an equation incorporating 0.25-, 2- and 4-h concentrations showed the highest predictive power (AUC₀-₁₂ = -17.20 + 1.18 × C₀.₂₅ + 2.75 × C₂ + 4.45 × C₄; MPPE = 0.1%, MAPE = 4.6%). Wide between-subject variability in drug exposure suggests a role for therapeutic drug monitoring (TDM). LSSs can accurately estimate both total and free prednisolone AUC(0-12). However, given the poor correlation observed between the two parameters, our data suggest that free prednisolone LSSs may be preferable.
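The best-performing limited sampling equation quoted in the abstract above can be applied directly. A minimal sketch follows; the function name is invented for illustration, and the concentration/AUC units are assumed to match the original study (this is not dosing guidance):

```python
def free_prednisolone_auc_0_12(c_0_25: float, c_2: float, c_4: float) -> float:
    """Estimate free prednisolone AUC(0-12) from concentrations drawn at
    0.25, 2 and 4 hours post-dose, using the regression reported in the
    abstract: AUC(0-12) = -17.20 + 1.18*C0.25 + 2.75*C2 + 4.45*C4.
    Units are assumed from context; illustrative sketch only."""
    return -17.20 + 1.18 * c_0_25 + 2.75 * c_2 + 4.45 * c_4
```

Note that with all three concentrations at zero the intercept dominates and the estimate is negative, a reminder that regression-based LSSs are only valid within the concentration range they were derived from.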
Publisher: Springer Science and Business Media LLC
Date: 20-10-2022
Publisher: Oxford University Press (OUP)
Date: 09-02-2009
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 04-2015
DOI: 10.2215/CJN.09060914
Publisher: Elsevier BV
Date: 08-2014
DOI: 10.1053/J.AJKD.2014.02.025
Abstract: Peritonitis is a common serious complication of peritoneal dialysis that results in considerable morbidity, mortality, and health care costs. It also significantly limits the use of this important dialysis modality. Despite its importance as a patient safety issue, peritonitis practices and outcomes vary markedly and unacceptably among different centers, regions, and countries. This article reviews peritonitis risk factors, diagnosis, treatment, and prevention, particularly focusing on potential drivers of variable practices and outcomes, controversial or unresolved areas, and promising avenues warranting further research. Potential strategies for augmenting the existing limited evidence base and reducing the gap between evidence-based best practice and actual practice also are discussed.
Publisher: Elsevier BV
Date: 05-1999
Publisher: Elsevier BV
Date: 09-2014
DOI: 10.1053/J.AJKD.2014.02.023
Abstract: Late referral for renal replacement therapy (RRT) leads to worse outcomes. In 2005, estimated glomerular filtration rate (eGFR) reporting began in Australasia, with an aim of substantially increasing earlier disease detection. Observational cohort study using the Australia and New Zealand Dialysis and Transplant Registry (ANZDATA) data. All patients commencing RRT in Australasia between January 1, 1999, and December 31, 2010. We excluded the period between December 31, 2004, and January 1, 2007, to allow for practice change. Introduction of eGFR reporting. Primary outcome was late referral defined as commencing RRT within 3 months of nephrology referral. Secondary outcomes included initial RRT modality and prepared access at hemodialysis therapy initiation. Late referral rates per era were determined and multilevel logistic regression was used to identify late referral predictors. We included 25,009 patients. Overall, 3,433 (25.3%) patients were referred late in the pre-eGFR era compared with 2,464 (21.6%) in the post-eGFR era, for an absolute reduction of 3.7% (95% CI, 2.7%-4.8%; P<0.001). After adjustments for age, body mass index, race, comorbid conditions, and primary kidney disease, adjusted late referral rates were 25.8% (95% CI, 23.3%-28.3%) and 21.8% (95% CI, 19.2%-24.4%) in the pre- and post-eGFR eras, respectively, for a difference of 4.0% (95% CI, 1.2%-6.8%; P=0.005). Late referral risk was attenuated significantly post-eGFR reporting (OR, 1.30; 95% CI, 1.12-1.51) compared to pre-eGFR reporting (OR, 2.15; 95% CI, 1.88-2.46) for indigenous patients. Late referral rates decreased for older patients but increased slightly for younger patients (P=0.001 for interaction between age and era). There was no impact on initial RRT modality or prepared access rates at hemodialysis therapy initiation between eras. Residual confounding could not be excluded. 
eGFR reporting was associated with small reductions in late referral, but more than 1 in 5 patients are still referred late. Other initiatives to increase timely referral warrant investigation.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 28-09-2018
DOI: 10.2215/CJN.06560518
Publisher: Wiley
Date: 06-09-2018
Publisher: Wiley
Date: 2010
DOI: 10.1111/J.1542-4758.2009.00419.X
Abstract: Hemodialysis has been associated with reduced quality of life (QOL). Small cohort studies of quotidian hemodialysis regimens suggest general QOL and dialysis-related symptoms may improve compared with conventional regimens. An observational cohort study was conducted on 63 patients (age 51.7 +/- 12.9 years; 79.4% male; 33.3% diabetes; duration of renal replacement therapy 1.9 [0.7-6.4] years) converted from conventional home hemodialysis (3-5 sessions weekly, 3-6 h/session) to home nocturnal hemodialysis (NHD) (3-5 sessions weekly, 6-10 h/session). Kidney Disease Quality of Life (KDQOL) and Assessment of Quality of Life instruments and 6-minute-walk tests were applied at baseline and 6 months. Baseline and 6-month surveys were returned by 70% of patients. On KDQOL, significant improvements in general health (P=0.02) and overall health ratings (P=0.0008), physical function (P=0.003), physical role (P=0.018), and energy and fatigue (P=0.027) were documented. There was a trend toward improvement in burden of kidney disease (P=0.05) and emotional role (P=0.066). There was a significant improvement in distance covered in the 6-minute-walk test from 513 m (420.5-576.4) to 536.5 m (459-609), P=0.007. On Assessment of Quality of Life, there was a trend toward improvement in overall utility score from 0.65 (0.39-0.81) to 0.73 (0.46-0.86), P=0.096. After 86.2 patient-years of observation, 23 patients have discontinued NHD (12 transplanted, 5 deceased, 4 psychosocial problems, 1 dialysis access problem, 1 medically unsuitable). Nocturnal home hemodialysis is a sustainable therapy. In addition to improving general QOL, alternate nightly NHD can significantly improve physical functioning as measured by KDQOL and 6-minute-walk tests.
Publisher: Elsevier BV
Date: 04-2006
DOI: 10.1016/J.LAB.2005.11.011
Abstract: Chronic kidney disease (CKD) is an increasingly common condition with limited treatment options that is placing a major financial and emotional burden on the community. The use of complementary and alternative medicines (CAMS) has increased many-fold over the past decade. Although several compelling studies show renal toxicities and an adverse outcome from use of some CAMS, there is also emerging evidence in the literature that some may be renoprotective. Many nephrologists are unaware of these potential therapeutic benefits in treating CKD, or they are reluctant to consider them in research trials for fear of adverse effects (including nephrotoxicity) or deleterious interaction with co-prescribed, conventional medicines. The increased use of self-prescribed CAMS by their patients suggests that practitioners and researchers should keep abreast of the current information on these agents. A primary goal of this article was to review the available scientific evidence for the use of herbs or natural substances as a complementary treatment for patients with CKD. A further goal was to report the literature on herbs that have been reported to cause kidney failure.
Publisher: Oxford University Press (OUP)
Date: 23-05-2016
DOI: 10.1093/NDT/GFW092A
Publisher: Wiley
Date: 28-06-2011
Publisher: Elsevier BV
Date: 2013
DOI: 10.1053/J.AJKD.2012.07.008
Abstract: There has been little study to date of daily variation in cardiac death in dialysis patients and whether such variation differs according to dialysis modality and session frequency. Observational cohort study using ANZDATA (Australia and New Zealand Dialysis and Transplant) Registry data. All adult patients with end-stage kidney failure treated by dialysis in Australia and New Zealand who died between 1999 and 2008. Timing of death (day of week), dialysis modality, hemodialysis (HD) session frequency, and demographic, clinical, and facility variables. Cardiac and noncardiac mortality. 14,636 adult dialysis patients died during the study period (HD, n = 10,338; peritoneal dialysis [PD], n = 4,298). Cardiac death accounted for 40% of deaths and was significantly more likely to occur on Mondays in in-center HD patients receiving 3 or fewer dialysis sessions per week (n = 9,503; adjusted OR, 1.26; 95% CI, 1.14-1.40; P < 0.001 compared with the mean odds of cardiac death for all days of the week). This daily variation in cardiac death was not seen in PD patients, in-center HD patients receiving more than 3 sessions per week (n = 251), or home HD patients (n = 573). Subgroup analyses showed that deaths related to hyperkalemia and myocardial infarction also were associated with daily variation in risk in HD patients. This pattern was not seen for vascular, infective, malignant, dialysis therapy withdrawal, or other deaths. Limited covariate adjustment. Residual confounding and coding bias could not be excluded. Possible type 2 statistical error due to limited sample size of home HD and enhanced-frequency HD cohorts. Daily variation in the pattern of cardiac deaths was observed in HD patients receiving 3 or fewer dialysis sessions per week, but not in PD, home HD, and HD patients receiving more than 3 sessions per week.
Publisher: Wiley
Date: 25-10-2020
DOI: 10.1111/JSR.13221
Abstract: Sleep disturbances are common among patients receiving dialysis and are associated with an increased risk of mortality and morbidity, and impaired quality of life. Despite being highly prioritised by patients, sleep problems remain under‐diagnosed and inadequately managed. The aim of the present study was to describe the perspectives of patients receiving dialysis and their caregivers on sleep. We extracted qualitative data on sleep from 26 focus groups, two international Delphi surveys, and two consensus workshops involving 644 patients and caregivers from 86 countries as part of the Standardised Outcomes in Nephrology‐Haemodialysis and ‐Peritoneal Dialysis (SONG‐HD/SONG‐PD) initiatives. The responses were from patients aged ≥18 years receiving haemodialysis or peritoneal dialysis, and their caregivers. We analysed the data using thematic analysis, with five themes identified: constraining daily living (with subthemes of battling intrusive tiredness, exacerbating debilitating conditions, and broken and incapacitated); roadblocks in relationships (unable to meet family needs, antipathy due to misunderstanding, wreaking emotional havoc); burden on caregivers (stress on support persons, remaining alert to help); losing enjoyment (limiting social contact, disempowerment in life); and undermining mental resilience (aggravating low mood, diminishing coping skills, reducing functional ability). Sleep disturbances are exhausting for patients on dialysis and pervade all aspects of their lives, including the ability to do daily tasks, maintain relationships, and preserve mental and emotional well‐being. Better assessment and management of sleep problems in dialysis is needed, which may lead to improvements in overall health and quality of life.
Publisher: Elsevier BV
Date: 04-2019
DOI: 10.1016/J.KINT.2018.12.005
Abstract: The global nephrology community recognizes the need for a cohesive strategy to address the growing problem of end-stage kidney disease (ESKD). In March 2018, the International Society of Nephrology hosted a summit on integrated ESKD care, including 92 individuals from around the globe with diverse expertise and professional backgrounds. The attendees were from 41 countries, including 16 participants from 11 low- and lower-middle-income countries. The purpose was to develop a strategic plan to improve worldwide access to integrated ESKD care, by identifying and prioritizing key activities across 8 themes: (i) estimates of ESKD burden and treatment coverage, (ii) advocacy, (iii) education and training/workforce, (iv) financing/funding models, (v) ethics, (vi) dialysis, (vii) transplantation, and (viii) conservative care. Action plans with prioritized lists of goals, activities, and key deliverables, and an overarching performance framework were developed for each theme. Examples of these key deliverables include improved data availability; integration of core registry measures and analysis to inform development of health care policy; a framework for advocacy; improved and continued stakeholder engagement; improved workforce training; equitable, efficient, and cost-effective funding models; greater understanding and greater application of ethical principles in practice and policy; definition and application of standards for safe and sustainable dialysis treatment and a set of measurable quality parameters; and integration of dialysis, transplantation, and comprehensive conservative care as ESKD treatment options within the context of overall health priorities. Intended users of the action plans include clinicians, patients and their families, scientists, industry partners, government decision makers, and advocacy organizations. 
Implementation of this integrated and comprehensive plan is intended to improve quality and access to care and thereby reduce serious health-related suffering of adults and children affected by ESKD worldwide.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 02-2019
Publisher: Wiley
Date: 04-07-2007
Publisher: Wiley
Date: 07-05-2007
DOI: 10.1111/J.1440-1797.2007.00795.X
Abstract: Residual renal function (RRF) is not currently emphasized for patients undergoing haemodialysis (HD). The role of RRF is well recognized in the peritoneal dialysis population, as studies have clearly demonstrated a survival benefit with preservation of RRF. There are, however, data to suggest that RRF is important in HD patients as well. Contemporary HD therapies, using high-flux biocompatible synthetic dialysers and bicarbonate-buffered ultrapure dialysis fluids with ultrafiltration control, appear to allow better preservation of RRF. The long-held belief that peritoneal dialysis is better at preserving RRF than HD may no longer be true. More robust studies are required to determine the relative importance of RRF in HD and strategies to best preserve this vital asset.
Publisher: Elsevier BV
Date: 04-2005
DOI: 10.1111/J.1523-1755.2005.00209.X
Abstract: Fibrin deposition is frequently observed within the tubulointerstitium in various forms of chronic renal disease. This suggests the presence of active components of the coagulation pathway, which may contribute to the progressive deterioration in renal function. The aim of this study was to investigate the proinflammatory and fibroproliferative effects of the coagulation protease thrombin on human proximal tubular cells (PTC) in culture. Primary cultures of PTC were established from normal kidney tissue and grown under serum-free conditions with or without thrombin or the protease-activated receptor (PAR) activating peptides TFLLRN-NH(2), SLIGKV-NH(2), and SFLLRN-NH(2) (100 to 400 micromol/L). DNA synthesis (thymidine incorporation), intracellular Ca(2+) mobilization (fura-2 fluorimetry), fibronectin secretion [enzyme-linked immunosorbent assay (ELISA), immunoblotting], monocyte chemoattractant protein-1 (MCP-1) secretion (ELISA), and transforming growth factor-beta1 (TGF-beta1) secretion (ELISA) were measured. Reverse transcription-polymerase chain reaction (RT-PCR) was used to assess PAR mRNA expression in these cells. Thrombin enhanced DNA synthesis, fibronectin secretion, MCP-1 secretion, and TGF-beta1 secretion in a concentration-dependent manner. Cell injury [lactate dehydrogenase (LDH) release] and cellular protein levels were unaffected. RT-PCR showed that cultures of PTC expressed mRNA transcripts for the thrombin receptors PAR-1 and PAR-3, but not PAR-4. Thrombin and each of the PAR activating peptides enhanced intracellular calcium mobilization. However, the other effects of thrombin were only fully reproduced by the PAR-2-specific peptide, SLIGKV-NH(2), only partially by SFLLRN-NH(2), (a PAR-1 peptide that can activate PAR-2), and not at all by the PAR-1-specific peptide, TFLLRN-NH(2). 
Thrombin-induced DNA synthesis, fibronectin secretion, and MCP-1 secretion were unaffected by a TGF-beta neutralizing antibody, the matrix metalloproteinase (MMP) inhibitor GM6001, or the epidermal growth factor (EGF) receptor kinase inhibitor AG1478. Thrombin initiates both proinflammatory and fibroproliferative responses in human PTC. These responses, which are dependent on its protease activity, appear not to be mediated by PAR-1 activation, the autocrine action of thrombin-induced TGF-beta1 secretion, MMP activation, or EGF receptor transactivation. The proinflammatory and fibroproliferative actions of thrombin on human PTC may help explain the extent of tubulointerstitial fibrosis observed in kidney diseases where fibrin deposition is evident.
Publisher: SAGE Publications
Date: 05-2003
DOI: 10.1177/089686080302300311
Abstract: The aim of this study was to prospectively evaluate the risk factors for decline of residual renal function (RRF) in an incident peritoneal dialysis (PD) population. Prospective observational study of an incident PD cohort at a single center. Tertiary-care institutional dialysis center. The study included 146 consecutive patients commencing PD at the Princess Alexandra Hospital between 1 August 1995 and 1 July 2001 (mean age 54.8 ± 1.4 years, 42% male, 34% diabetic). Patients with failed renal transplants (n = 26) were excluded. Timed urine collections (n = 642) were performed initially and at 6-month intervals thereafter to measure RRF. The development of anuria was also prospectively recorded. The mean (±SD) follow-up period was 20.5 ± 14.8 months. The median slope of RRF decline was –0.05 mL/minute/month/1.73 m2. Using binary logistic regression, it was shown that the 50% of patients with more rapid RRF loss (slope < –0.05 mL/min/month/1.73 m2) were more likely to have had a higher initial RRF at commencement of PD [adjusted odds ratio (AOR) 1.83, 95% confidence interval (CI) 1.39 – 2.40] and a higher baseline dialysate/plasma creatinine ratio at 4 hours (D/P creat; AOR 44.6, 95% CI 1.05 – 1900). On multivariate Cox proportional hazards model analysis, time from commencement of PD to development of anuria was independently predicted by baseline RRF [adjusted hazard ratio (HR) 0.81, 95% CI 0.60 – 0.81], D/P creat (HR 2.87, 95% CI 2.06 – 82.3), body surface area (HR 6.23, 95% CI 1.53 – 25.5), dietary protein intake (HR 2.87, 95% CI 1.06 – 7.78), and diabetes mellitus (HR 1.65, 95% CI 1.00 – 2.72). Decline of RRF was independent of age, gender, dialysis modality, urgency of initiation of dialysis, smoking, vascular disease, blood pressure, medications (including angiotensin-converting enzyme inhibitors), duration of follow-up, and peritonitis rate. 
The results of this study suggest that high baseline RRF and high D/P creat ratio are risk factors for rapid loss of RRF. Moreover, a shorter time to the onset of anuria is independently predicted by low baseline RRF, increased body surface area, high dietary protein intake, and diabetes mellitus. Such at-risk patients should be closely monitored for early signs of inadequate dialysis.
Publisher: Elsevier BV
Date: 12-1997
DOI: 10.1038/KI.1997.479
Abstract: To determine the paracrine interactions involved in the tubulointerstitial response to progressive renal disease, the role of insulin-like growth factor-I (IGF-I) and its binding proteins (IGFBPs) in in vitro interactions between human proximal tubule cells (PTC) and renal cortical fibroblasts (CF) were studied in primary cell culture. PTC growth and transport were increased in the presence of CF-conditioned media (CF-CM), as shown by increased thymidine incorporation, cellular protein content and sodium-hydrogen exchange (NHE) activity, to 185 +/- 31% (P < 0.01), 150 +/- 18% (P < 0.05) and 195 +/- 27% (P < 0.01) of the control values, respectively. IGF-I was produced by cultured CF at a rate of 64.6 +/- 7.5 ng/mg protein/day. Exogenous IGF-I applied to PTC provoked similar enhancement of growth and NHE activity as CF-CM and the stimulatory effect of CF-CM was blocked by specific immunoneutralization of IGF-I receptors. These receptors were threefold more abundant on PTC basolateral versus apical membranes. IGF binding proteins (IGFBP)-2 and IGFBP-3 were secreted by CF at rates of 694 +/- 88 and 1769 +/- 45 ng/mg/day, with the release of IGFBP-3 being enhanced in the presence of PTC-CM (120.0 +/- 9.7% of control, P < 0.01). Moreover, the addition of CF-CM to PTC increased cell-associated IGFBP-3 on PTC surfaces, without changes in IGF-I receptor numbers or affinity and without changes in PTC mRNA for IGFBP-3. Des(1-3)IGF-I, an analog that binds to the IGF-I receptor but not to IGFBPs, provided a less potent stimulus for PTC growth compared with IGF-I, indicating that cell-associated IGFBP-3 facilitates the action of IGF-I on PTC. The results support important paracrine roles for both IGF-I and IGFBPs in the interstitial regulation of proximal tubule growth and transport.
Publisher: SAGE Publications
Date: 03-2022
DOI: 10.1177/08968608221080586
Abstract: Peritoneal dialysis (PD)-associated peritonitis is a serious complication of PD, and its prevention and treatment are important in reducing patient morbidity and mortality. The ISPD 2022 updated recommendations have revised and clarified definitions for refractory peritonitis, relapsing peritonitis, peritonitis-associated catheter removal, PD-associated haemodialysis transfer, peritonitis-associated death and peritonitis-associated hospitalisation. New peritonitis categories and outcomes, including pre-PD peritonitis, enteric peritonitis, catheter-related peritonitis and medical cure, are defined. The new recommended target for the overall peritonitis rate is no more than 0.40 episodes per year at risk, and the percentage of patients free of peritonitis per unit time should be targeted at >80% per year. Revised recommendations regarding management of contamination of PD systems, antibiotic prophylaxis for invasive procedures, and PD training and reassessment are included. New recommendations regarding management of modifiable peritonitis risk factors like domestic pets, hypokalaemia and histamine-2 receptor antagonists are highlighted. Updated recommendations regarding empirical antibiotic selection and dosage of antibiotics, and also treatment of peritonitis due to specific microorganisms, are made, with a new recommendation regarding adjunctive oral N-acetylcysteine therapy for mitigating aminoglycoside ototoxicity. Areas for future research in prevention and treatment of PD-related peritonitis are suggested.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 08-2004
DOI: 10.1097/00007691-200408000-00017
Abstract: A 58-year-old man with end-stage renal failure secondary to polycystic kidney disease developed a profoundly elevated mycophenolic acid (MPA) free fraction and associated severe toxicity after cadaveric renal transplantation. Initial immunosuppressive therapy was 4 mg/kg body weight bid cyclosporin (Neoral; Novartis Pharmaceutical Co Ltd, Sydney, Australia) given orally with 1 g bid mycophenolate mofetil (MMF) (CellCept; Roche Products Pty Ltd, Sydney, Australia). In the first 5 days posttransplantation, the serum creatinine concentration fell, and the patient developed profound hypoalbuminemia (serum albumin 150 micromol/L) that resulted from progressing biliary obstruction. On day 5 posttransplantation, the 2-hour whole-blood cyclosporin concentration and total MPA area under the curve (AUC(0-6)) were low (837 microg/L and 12.6 mg x h/L, respectively), while the total mycophenolic acid glucuronide (MPAG) AUC(0-6) was elevated (1317 mg x h/L). MMF was continued at the same dose, but tacrolimus was substituted for cyclosporin. The patient subsequently experienced severe nausea, vomiting, hematemesis, and pancytopenia (nadir white cell count 1.6 x 10(9)/L, platelet count 32 x 10(9)/L, and hemoglobin 73 g/L) that normalized after cessation of MMF. Retrospective measurement of the free MPA concentration on day 5 showed that the free MPA AUC(0-6) was markedly elevated at 2.3 mg x h/L, as was the free fraction, at 18.3%. This case illustrates how altered protein binding can be associated with severe MMF toxicity caused by an increased free MPA concentration despite a relatively low total MPA. These data support the monitoring of free MPA concentrations in those patients considered at risk for MMF-related toxicity.
Publisher: Wiley
Date: 26-11-2018
DOI: 10.1111/SDI.12658
Abstract: In patients receiving hemodialysis, the provision of safe and effective vascular access using an arteriovenous fistula or graft is regarded as a critical priority by patients and health professionals. Vascular access failure is associated with morbidity and mortality, such that strategies to prevent these outcomes are essential. Inadequate vascular remodeling and neointimal hyperplasia resulting in stenosis and frequently thrombosis are critical to the pathogenesis of access failure. Systemic medical therapies with pleiotropic effects including antiplatelet agents, omega-3 polyunsaturated fatty acids (fish oils), statins, and inhibitors of the renin-angiotensin-aldosterone system (RAAS) may reduce vascular access failure by promoting vascular access maturation and reducing stenosis and thrombosis through antiproliferative, antiaggregatory, anti-inflammatory and vasodilatory effects. Despite such promise, the results of retrospective analyses and randomized controlled trials of these agents on arteriovenous fistula and graft outcomes have been mixed. This review describes the current understanding of the pathogenesis of arteriovenous fistula and graft failure, the biological effects of antiplatelet agents, fish oil supplementation, RAAS blockers and statins that may be beneficial in improving vascular access survival, results from clinical trials that have investigated the effect of these agents on arteriovenous fistula and graft outcomes, and it explores future therapeutic approaches combining these agents with novel treatment strategies.
Publisher: Oxford University Press (OUP)
Date: 11-07-2016
DOI: 10.1093/NDT/GFV269
Abstract: Chronic kidney disease (CKD) is strongly associated with increased risks of progression to end-stage kidney disease (ESKD) and mortality. Clinical trials evaluating CKD progression commonly use a composite end point of death, ESKD or serum creatinine doubling. However, due to low event rates, such trials require large sample sizes and long-term follow-up for adequate statistical power. As a result, very few interventions targeting CKD progression have been tested in randomized controlled trials. To overcome this problem, the National Kidney Foundation and Food and Drug Administration conducted a series of analyses to determine whether an end point of 30 or 40% decline in estimated glomerular filtration rate (eGFR) over 2-3 years can substitute for serum creatinine doubling in the composite end point. These analyses demonstrated that these alternate kidney end points were significantly associated with subsequent risks of ESKD and death. However, the association between eGFR decline and clinical end points, and the consistency of treatment effects on them, were influenced by baseline eGFR, follow-up duration and acute hemodynamic effects. The investigators concluded that a 40% eGFR decline is broadly acceptable as a kidney end point across a wide baseline eGFR range and that a 30% eGFR decline may be acceptable in some situations. Although these alternate kidney end points could potentially allow investigators to conduct shorter-duration clinical trials with smaller sample sizes, thereby generating evidence to guide clinical decision-making in a timely manner, it is uncertain whether these end points will improve trial efficiency and feasibility. This review critically appraises the evidence, strengths and limitations pertaining to eGFR end points.
Publisher: Wiley
Date: 08-04-2017
Publisher: Public Library of Science (PLoS)
Date: 25-03-2021
DOI: 10.1371/JOURNAL.PONE.0248983
Abstract: Expression of the protease sensing receptor, protease activated receptor-2 (PAR2), is elevated in a variety of cancers and has been promoted as a potential therapeutic target. With the development of potent antagonists for this receptor, we hypothesised that they could be used to treat renal cell carcinoma (RCC). The expression of PAR2 was, therefore, examined in human RCC tissues and selected RCC cell lines. Histologically confirmed cases of RCC, together with paired non-involved kidney tissue, were used to produce a tissue microarray (TMA) and to extract total tissue RNA. Immunohistochemistry and qPCR were then used to assess PAR2 expression. In culture, RCC cell lines versus primary human kidney tubular epithelial cells (HTEC) were used to assess PAR2 expression by qPCR, immunocytochemistry and an intracellular calcium mobilization assay. The TMA revealed an 85% decrease in PAR2 expression in tumour tissue compared with normal kidney tissue. Likewise, qPCR showed a striking reduction in PAR2 mRNA in RCC compared with normal kidney. All RCC cell lines showed lower levels of PAR2 expression than HTEC. In conclusion, we found that PAR2 was reduced in RCC compared with normal kidney and is unlikely to be a target of interest in the treatment of this type of cancer.
Publisher: John Wiley & Sons, Ltd
Date: 18-10-2004
Publisher: Informa UK Limited
Date: 11-01-2017
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 04-2011
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 03-2000
DOI: 10.1097/00007890-200003150-00020
Abstract: Patients over age 60 constitute half of all new patients accepted into the renal replacement therapy programs in Australia. However, the optimal treatment of their end-stage renal disease remains controversial. The aim of the present study was to compare survival for dialysis and renal transplantation in older patients who were rigorously screened and considered eligible for transplantation. The study cohort consisted of 174 consecutive patients over 60 who were accepted on to the Queensland cadaveric renal transplant waiting list between January 1, 1993 and December 31, 1997. Follow-up was terminated on October 1, 1998. Data were analyzed on an intention-to-transplant basis using a Cox regression model with time-varying explanatory variables. An alternative survival analysis was also performed, in which patients no longer considered suitable for transplantation were censored at the time of their removal from the waiting list. There were 67 patients receiving a renal transplant, whereas the other 107 continued to undergo dialysis. These two groups were well matched at baseline with respect to age, gender, body mass index, renal disease etiology, comorbid illnesses, and dialysis duration and modality. The overall mortality rate was 0.096 per patient-year (0.131 for dialysis and 0.029 for transplant, P<0.001). Respective 1-, 3- and 5-year survivals were 92%, 62%, and 27% for the dialysis group and 98%, 95%, and 90% (P<0.01) for the transplant group. Patients in the transplant group had an adjusted hazard ratio 0.16 times that of the dialysis group (95% confidence interval 0.06-0.42). If patients were censored at the time of their withdrawal from the transplant waiting list, the adjusted hazard ratio was 0.24 (95% confidence interval 0.09-0.69). Renal transplantation seems to confer a substantial survival advantage over dialysis in patients with end-stage renal failure who are rigorously screened and considered suitable for renal transplantation.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 09-2018
Publisher: MDPI AG
Date: 04-06-2026
Abstract: The incidence of infectious complications, compared with the general population and the pre-transplant status of the recipient, increases substantially following kidney transplantation, causing significant morbidity and mortality. The potent immunosuppressive therapy given to prevent graft rejection in kidney transplant recipients results in an increased susceptibility to a wide range of opportunistic infections, including bacterial, viral and fungal infections. Over the last five years, several advances have occurred that may have changed the burden of infectious complications in kidney transplant recipients. The availability of direct-acting antivirals to manage donor-derived hepatitis C infection has opened the way for donors with hepatitis C infection to be considered in the donation process. In addition, there has been the development of medications targeting the growing burden of resistant cytomegalovirus, as well as the discovery of the potentially important role of the gastrointestinal microbiota in the pathogenesis of post-transplant infection. In this narrative review, we will discuss these three advances and their potential implications for clinical practice.
Publisher: Wiley
Date: 24-07-2003
DOI: 10.1046/J.1440-1797.2003.00162.X
Abstract: Fibrogenic stresses promote progression of renal tubulointerstitial fibrosis, disparately affecting survival, proliferation and trans-differentiation of intrinsic renal cell populations through ill-defined biomolecular pathways. We investigated the effect of fibrogenic stresses on the activation of cell-specific mitogen-activated protein kinase (MAPK) in renal fibroblast, epithelial and endothelial cell populations. The relative outcomes (cell death, proliferation, trans-differentiation) associated with activation or inhibition of extracellular-regulated protein kinase (ERK) or stress activated/c-Jun N terminal kinase (JNK) were analysed in each renal cell population after challenge with oxidative stress (1 mmol/L H2O2), transforming growth factor-beta1 (TGF-beta1, 10 ng/mL) or tumour necrosis factor-alpha (TNF-alpha, 50 ng/mL) over 0-20 h. Apoptosis increased significantly in all cell types after oxidative stress (P < 0.05). In fibroblasts, oxidative stress caused the activation of ERK (pERK) but not JNK (pJNK). Inhibition of ERK by PD98059 supported its role in a fibroblast death pathway. In epithelial and endothelial cells, oxidative stress-induced apoptosis was preceded by early induction of pERK, but its inhibition did not support a pro-apoptotic role. Early ERK activity may be conducive to their survival or promote the trans-differentiation of epithelial cells. In epithelial and endothelial cells, oxidative stress induced pJNK acutely. Pretreatment with SP600125 (JNK inhibitor) verified its pro-apoptotic activity only in epithelial cells. Transforming growth factor-beta1 did not significantly alter mitosis or apoptosis in any of the cell types, nor did it alter MAPK activity. Tumor necrosis factor-alpha caused increased apoptosis with no associated change in MAPK activity. 
Our results demonstrate renal cell-specific differences in the activation of ERK and JNK following fibrotic insult, which may be useful for targeting excessive fibroblast proliferation in chronic fibrosis.
Publisher: Oxford University Press (OUP)
Date: 03-03-2014
DOI: 10.1093/NDT/GFU050
Abstract: There has not been a comprehensive examination to date of peritoneal dialysis (PD) outcomes after temporary haemodialysis (HD) transfer for peritonitis. The study included all incident Australian patients who experienced peritonitis between 1 October 2003 and 31 December 2011, using Australia and New Zealand Dialysis and Transplant Registry data. Patients were grouped into three categories: Interim HD, Permanent HD and Never HD, based on HD transfer status after the first peritonitis. The independent predictors of HD transfer and subsequent return to PD were determined by multivariable, multilevel mixed-effects logistic regression analysis. Matched case-control analyses were performed to compare clinical outcomes (e.g. patient survival) between groups. Of the 3305 patients who experienced peritonitis during the study period, 553 episodes (16.7%) resulted in transfer to HD and 101 patients subsequently returned to PD. HD transfer was significantly and independently predicted by inpatient treatment of peritonitis [odds ratio (OR) 11.45, 95% confidence interval (CI) 7.14-18.36] and the recovered microbiologic profile of organisms recognized to be associated with moderate (20-40%) to high (>40%) rates of catheter removal (moderate: OR 2.45, 95% CI 1.89-3.17; high: OR 8.63, 95% CI 6.44-11.57). Matched case-control analyses yielded comparable results among Interim, Permanent and Never HD groups in terms of patient survival (P = 0.28), death-censored technique survival [hazard ratio (HR) 0.87, 95% CI 0.59-1.28; P = 0.48] and peritonitis-free survival (HR 0.84, 95% CI 0.50-1.39; P = 0.49). In an observational registry study of first peritonitis episodes, temporary HD transfer was not associated with inferior patient-level clinical outcomes when compared with others who either never required HD transfer or remained on HD permanently, if all patient-level and peritonitis-related factors were considered equal.
Therefore, return to PD after a temporary HD due to peritonitis should not be discouraged in appropriate PD patients.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 25-04-2018
Abstract: Background Mediterranean and Dietary Approaches to Stop Hypertension (DASH) diets associate with lower cardiovascular and all-cause mortality in the general population, but the benefits for patients on hemodialysis are uncertain. Methods Mediterranean and DASH diet scores were derived from the GA²LEN Food Frequency Questionnaire within the DIET-HD Study, a multinational cohort study of 9757 adults on hemodialysis. We conducted adjusted Cox regression analyses clustered by country to evaluate the association between diet score tertiles and all-cause and cardiovascular mortality (the lowest tertile was the reference category). Results During the median 2.7-year follow-up, 2087 deaths (829 cardiovascular deaths) occurred. The adjusted hazard ratios (95% confidence intervals) for the middle and highest Mediterranean diet score tertiles were 1.20 (1.01 to 1.41) and 1.14 (0.90 to 1.43), respectively, for cardiovascular mortality and 1.10 (0.99 to 1.22) and 1.01 (0.88 to 1.17), respectively, for all-cause mortality. Corresponding estimates for the same DASH diet score tertiles were 1.01 (0.85 to 1.21) and 1.19 (0.99 to 1.43), respectively, for cardiovascular mortality and 1.03 (0.92 to 1.15) and 1.00 (0.89 to 1.12), respectively, for all-cause mortality. The association between DASH diet score and all-cause death was modified by age (P = 0.03); adjusted hazard ratios for the middle and highest DASH diet score tertiles were 1.02 (0.81 to 1.29) and 0.70 (0.53 to 0.94), respectively, for younger patients (≤60 years old) and 1.05 (0.93 to 1.19) and 1.08 (0.95 to 1.23), respectively, for older patients. Conclusions Mediterranean and DASH diets did not associate with cardiovascular or total mortality in hemodialysis.
Publisher: American College of Physicians
Date: 06-07-2010
DOI: 10.7326/0003-4819-153-1-201007060-00252
Abstract: Previous meta-analyses suggest that treatment with erythropoiesis-stimulating agents (ESAs) in chronic kidney disease (CKD) increases the risk for death. Additional randomized trials have been recently completed. To summarize the effects of ESA treatment on clinical outcomes in patients with anemia and CKD. MEDLINE (January 1966 to November 2009), EMBASE (January 1980 to November 2009), and the Cochrane database (to March 2010) were searched without language restriction. Two authors independently screened reports to identify randomized trials evaluating ESA treatment in people with CKD. Hemoglobin target trials or trials of ESA versus no treatment or placebo were included. Two authors independently extracted data on patient characteristics, study risks for bias, and the effects of ESA therapy. 27 trials (10 452 patients) were identified. A higher hemoglobin target was associated with increased risks for stroke (relative risk [RR], 1.51 [95% CI, 1.03 to 2.21]), hypertension (RR, 1.67 [CI, 1.31 to 2.12]), and vascular access thrombosis (RR, 1.33 [CI, 1.16 to 1.53]) compared with a lower hemoglobin target. No statistically significant differences in the risks for mortality (RR, 1.09 [CI, 0.99 to 1.20]), serious cardiovascular events (RR, 1.15 [CI, 0.98 to 1.33]), or end-stage kidney disease (RR, 1.08 [CI, 0.97 to 1.20]) were observed, although point estimates favored a lower hemoglobin target. Treatment effects were consistent across subgroups, including all stages of CKD. The evidence for effects on quality of life was limited by selective reporting. Trials also reported insufficient information to allow analysis of the independent effects of ESA dose on clinical outcomes. Targeting higher hemoglobin levels in CKD increases risks for stroke, hypertension, and vascular access thrombosis and probably increases risks for death, serious cardiovascular events, and end-stage renal disease. 
The mechanisms for harm remain unclear, and meta-analysis of individual-patient data and trials on fixed ESA doses are recommended to elucidate these mechanisms. Primary funding source: none.
Publisher: Wiley
Date: 20-01-2005
Publisher: Elsevier BV
Date: 05-2021
Publisher: Elsevier BV
Date: 07-2009
DOI: 10.1053/J.AJKD.2009.03.010
Abstract: Primary hepatitis B virus (HBV) vaccination through the intramuscular (IM) route is less efficacious in dialysis patients than in the general population. Previous studies suggest improved seroconversion with intradermal (ID) vaccination. In this prospective, open-label, randomized controlled trial, hemodialysis patients nonresponsive to primary HBV vaccination received revaccination with either ID (10 microg of vaccine every week for 8 weeks) [DOSAGE ERROR CORRECTED] or IM (40 microg of vaccine at weeks 1 and 8) HBV vaccine. The primary outcome was the proportion of patients achieving an HBV surface antibody (anti-HBs) titer of 10 IU/L or greater within 2 months of the vaccination course; secondary outcomes were time to seroconversion, predictors of seroconversion, peak antibody titer, duration of seroprotection, and safety and tolerability of the vaccine. Anti-HBs titers were followed to 24 months. 59 patients were analyzed. Seroconversion rates were 79% ID versus 40% IM (P = 0.002). The unadjusted odds ratio for seroconversion for ID versus IM was 5.5 (95% confidence interval [CI], 1.6 to 18.4) and increased with adjustment for baseline differences. The only factor predictive of seroconversion was the ID vaccination route. The geometric mean peak antibody titer was significantly greater in the ID versus IM group: 239 IU/L (95% CI, 131 to 434) versus 78 IU/L (95% CI, 36 to 168; P < 0.001). There was a trend toward longer duration of seroprotection with ID vaccination. ID vaccine was safe and well tolerated. Limitations included the inability to distinguish whether the mechanism of the greater efficacy of ID vaccination was the cumulative effect of multiple injections or the route of administration; the use of anti-HBs as a surrogate marker of protection; and the lack of evidence of long-term protection. Significantly greater seroconversion rates and peak antibody titers can be achieved with ID compared with IM vaccination in hemodialysis patients nonresponsive to primary vaccination. ID vaccination should become the standard of care in this setting.
Publisher: Wiley
Date: 02-2006
Publisher: Elsevier BV
Date: 2012
DOI: 10.1053/J.AJKD.2011.06.018
Abstract: Determinants and outcomes of peritoneal dialysis (PD)-associated peritonitis occurring within 4 weeks of completion of therapy of a prior episode caused by the same (relapse) or different organism (recurrence) recently have been characterized. However, determinants and outcomes of peritonitis occurring more than 4 weeks after treatment of a prior episode caused by the same (repeated) or different organism (nonrepeated) are poorly understood. Observational cohort study using Australia and New Zealand Dialysis and Transplant Registry (ANZDATA) data. All Australian PD patients between October 1, 2003, and December 31, 2007, with first episodes of repeated or nonrepeated peritonitis. Repeated versus nonrepeated peritonitis, according to International Society of PD (ISPD) criteria. Relapse, hospitalization, catheter removal, hemodialysis transfer, and death. After a peritonitis episode, the probability that a subsequent episode represented repeated rather than nonrepeated peritonitis was highest in the second month (41%), then progressively decreased to a stable level of 14% from 6 months onward. When first episodes of repeated (n = 245) or nonrepeated peritonitis (n = 824) were analyzed, repeated peritonitis was predicted independently by a shorter elapsed time from the prior episode (adjusted OR per day elapsed, 0.91; 95% CI, 0.88-0.94). Staphylococcus aureus and coagulase-negative staphylococcus were isolated more frequently in repeated peritonitis, whereas Gram-negative, streptococcal, and fungal organisms were recovered more frequently in nonrepeated peritonitis. Using multivariate logistic regression, repeated peritonitis was associated independently with higher relapse (OR, 5.41; 95% CI, 3.72-7.89) and lower hospitalization rates (OR, 0.63; 95% CI, 0.46-0.85), but catheter removal, hemodialysis transfer, and death rates similar to nonrepeated peritonitis. Limited covariate adjustment. Residual confounding and coding bias could not be excluded. 
Repeated and nonrepeated peritonitis episodes are caused by different spectra of micro-organisms and have different outcomes. Study findings suggest that the ISPD definition for repeated peritonitis should be limited to 6 months.
Publisher: Wiley
Date: 10-09-2008
DOI: 10.1111/J.1399-3062.2007.00273.X
Abstract: Cytomegalovirus (CMV) is an important and well-described opportunistic virus in the immunocompromised host, with infection occurring mainly after the first month in the new renal transplant recipient. CMV can present as primary infection, reinfection, or reactivation of latent disease. It is capable of protean manifestations. Cutaneous manifestations are variable, rare, and diagnosis often delayed. We present 3 cases of cutaneous CMV disease in renal transplant recipients. Manifestations in our patients included ulceration of the tongue and perianal areas, facial petechiae, and nodular lesion involving the ear. This case series serves to highlight the importance of early skin biopsy in the diagnosis and management of cutaneous CMV disease.
Publisher: S. Karger AG
Date: 06-2013
DOI: 10.1159/000350726
Abstract: Background: The incidence and cost of chronic kidney disease (CKD) are increasing. Renal tubular epithelial cell dysfunction and attrition, involving increased apoptosis and cell senescence, are central to the pathogenesis of CKD. The aim here was to use an in vitro model to investigate the separate and cumulative effects of oxidative stress, mitochondrial dysfunction and cell senescence in promoting loss of renal mass. Methods: Human kidney tubular epithelial cells (HK2) were treated with moderate hydrogen peroxide (H2O2) for oxidative stress, with or without cell cycle inhibition (apigenin, API) for cell senescence. Adenosine triphosphate (ATP) and oxidative stress were measured by ATP assay, lipid peroxidation and total antioxidant capacity, and mitochondrial function with confocal microscopy, MitoTracker Red CMXRos and live cell imaging with JC-1. In parallel, cell death and injury (i.e. apoptosis and Bax/Bcl-XL expression, lactate dehydrogenase), cell senescence (SA-β-galactosidase) and renal regenerative ability (cell proliferation), and their modulation with the anti-oxidant N-acetyl-cysteine (NAC), were investigated. Results: H2O2 and API, separately, increased oxidative stress and mitochondrial dysfunction, apoptosis and cell senescence. Although API caused cell senescence, it also induced oxidative stress at levels similar to H2O2 treatment alone, indicating that senescence and oxidative stress may be intrinsically linked. When H2O2 and API were delivered concurrently, their detrimental effects on renal cell loss were compounded. The antioxidant NAC attenuated apoptosis and senescence, and restored regenerative potential to the kidney. Conclusion: Oxidative stress and cell senescence both cause mitochondrial destabilization and cell loss and contribute to the development of the cellular characteristics of CKD.
Publisher: Springer Science and Business Media LLC
Date: 07-09-2021
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 11-2006
DOI: 10.2215/CJN.00210106
Publisher: Public Library of Science (PLoS)
Date: 26-03-2019
Publisher: SAGE Publications
Date: 09-2017
Abstract: The HONEYPOT trial failed to establish the superiority of exit-site application of Medihoney compared with nasal mupirocin prophylaxis for the prevention of peritonitis in peritoneal dialysis (PD) patients. This study aimed to assess the representativeness of the patients in the HONEYPOT trial to the Australian and New Zealand PD population. This study compared baseline characteristics of the 371 PD patients in the HONEYPOT trial with those of 6,085 PD patients recorded on the Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry. Compared with the PD population, the HONEYPOT sample was older (standardized difference [d] = 0.19, p = 0.003), more likely to be treated with automated PD (d = 0.58, p < 0.001), had higher residual renal function (d = 0.26, p < 0.001) and a higher proportion of participants with end-stage kidney disease due to polycystic kidney disease (d = 0.17) and lower proportions due to diabetes (d = -0.17) and glomerulonephritis (d = -0.18) (p < 0.001), and lower proportions of indigenous people (d = -0.17, p < 0.001), current smokers (d = -0.10, p < 0.001), and people with prior histories of hemodialysis (d = -0.16, p < 0.001), diabetes mellitus (d = -0.18, p < 0.001), and coronary artery disease (d = -0.15, p < 0.001). HONEYPOT trial participants tended to be healthier than the Australian and New Zealand PD patient population. Although the differences between the groups were generally modest, it is possible that their cumulative effect may have had some impact on external generalizability, which is not an uncommon occurrence in clinical trials.
Publisher: S. Karger AG
Date: 12-12-2016
DOI: 10.1159/000450690
Abstract: Glucose-based peritoneal dialysis (PD) solutions are the mainstay of therapy for PD patients, yet are accompanied by a number of adverse effects and potential complications. The high glucose content can cause both systemic effects, such as hyperglycaemia, as well as local effects on the peritoneal membrane, which can interfere with its function. In addition, glucose degradation products (GDPs) generated during heat sterilization of the solutions and the acidic pH at which these solutions are kept have been shown to cause peritoneal membrane injury and precipitate inflow pain, respectively. As a result, biocompatible PD solutions, characterized by neutral pH and low GDP concentrations, have been developed. However, the published evidence supporting their use has often been conflicting and of variable methodological quality. This review aims to discuss the relevant literature and up-to-date evidence for the use of biocompatible PD solutions.
Publisher: Elsevier BV
Date: 02-2005
DOI: 10.1111/J.1523-1755.2005.67135.X
Abstract: Higher total white blood cell counts (WCC) have been shown in the general population to be strongly and independently predictive of coronary heart disease and all-cause mortality. The aim of the present study was to evaluate the prognostic value of WCC in patients commencing peritoneal dialysis (PD). A cohort of 323 patients (mean age 55.1 +/- 17.7 years, 54% male, 81% Caucasian) commencing PD at the Princess Alexandra Hospital between January 1, 1998 and March 31, 2003 were prospectively followed until death, completion of PD therapy, or otherwise to the end of the study (January 2, 2004), at which point data were censored. Individuals with failed renal transplants (N = 17) and those with acute infections at the time of PD onset (N = 12) were not included. A multivariate Cox's proportional hazards model was applied to calculate hazard ratios and adjusted survival curves for time to death or cardiac death, adjusting for baseline demographic, clinical, and laboratory characteristics. Median actuarial patient survival was 3.9 years [95% confidence interval (CI) 3.2-4.7 years]. The highest quartile of WCC (>9.4 x 10(9)/L) was significantly and independently associated with increased risks of both death from all causes [adjusted hazard ratio (HR) 2.27, 95% CI 1.09-4.74, P < 0.05] and cardiac death (HR 3.75, 95% CI 1.2-11.8, P < 0.05). Other adverse risk factors included older age, lower serum albumin, and the presence of coronary artery disease. Similar associations were found between mortality and PMN count, but not lymphocyte count. Elevated baseline WCC or PMN count at the commencement of PD (in the absence of acute infection) strongly predicts all-cause and cardiovascular mortality. These data suggest that new PD patients with higher WCC may warrant closer monitoring and extra attention to modifiable cardiovascular risk factors.
Publisher: FapUNIFESP (SciELO)
Date: 12-2015
Publisher: SAGE Publications
Date: 09-2012
Abstract: Management of peritoneal dialysis (PD)–associated peritonitis requires timely intervention by experienced staff, which may not be uniformly available throughout the week. The aim of the present study was to examine the effects of weekend compared with weekday presentation on peritonitis outcomes. The study, which used data from the Australia and New Zealand Dialysis and Transplant Registry, included all Australian patients receiving PD between 1 October 2003 and 31 December 2008. The independent predictors of weekend presentation and subsequent peritonitis outcomes were assessed by multivariate logistic regression. Peritonitis presentation rates were significantly lower on Saturdays [0.46 episodes per year; 95% confidence interval (CI): 0.42 to 0.49 episodes per year] and on Sundays (0.43 episodes per year; 95% CI: 0.40 to 0.47 episodes per year) than on all other weekdays; they peaked on Mondays (0.76 episodes per year; 95% CI: 0.72 to 0.81 episodes per year). Weekend presentation with a first episode of peritonitis was independently associated with lower body mass index and residence less than 100 km away from the nearest PD unit. Patients presenting with peritonitis on the weekend were significantly more likely to be hospitalized [adjusted odds ratio (OR): 2.32; 95% CI: 1.85 to 2.90], although microbial profiles and empiric antimicrobial treatments were comparable between the weekend and weekday groups. Antimicrobial cure rates were also comparable (79% vs 79%, p = 0.9), with the exception of cure rates for culture-negative peritonitis, which were lower on the weekend (80% vs 88%, p = 0.047). Antifungal prophylaxis was less likely to be co-prescribed for first peritonitis episodes presenting on weekdays (OR: 0.68; 95% CI: 0.05 to 0.89). Patients on PD are less likely to present with peritonitis on the weekend. Nevertheless, the microbiology, treatment, and outcomes of weekend and weekday PD peritonitis presentations are remarkably similar. 
Exceptions include the associations of weekend presentation with a higher hospitalization rate and a lower cure rate in culture-negative infection.
Publisher: Public Library of Science (PLoS)
Date: 25-02-2014
Publisher: Oxford University Press (OUP)
Date: 07-07-2007
DOI: 10.1093/NDT/GFM324
Abstract: High transporter status is associated with reduced survival of patients receiving peritoneal dialysis (PD). This may be due primarily to the development of complications related to the PD process, in which case the survival disadvantage may not persist following transfer to haemodialysis (HD). In this study, we aimed to assess the impact of peritoneal membrane transporter status on patient survival and the likelihood of return to PD following transfer from PD to HD. The Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry was searched to identify all patients between 1 April 1999 and 31 March 2004 who had received PD and subsequently transferred to HD, in whom an incident 4 h dialysate:plasma creatinine ratio was recorded. A Cox proportional hazards model was used to identify factors significantly associated with patient and technique survival after commencement of HD. A total of 918 patients were included in the analysis. On multivariate Cox regression analysis there was no difference in survival between transport groups relative to the reference group of low average transporters [adjusted hazard ratio (HR) 0.71, 95% CI 0.42-1.19, P = 0.19; HR 0.94, 95% CI 0.63-1.38, P = 0.73; and HR 0.24, 95% CI 0.06-1.01, P = 0.051 for high, high average and low transporter groups, respectively]. Significant predictors of mortality were duration of PD more than 22 months (HR 2.32, 95% CI 1.24-4.33, P = 0.01), increasing age, late referral to a nephrologist and a history of diabetes mellitus. The likelihood of returning to PD was increased if initial PD technique failure was due to mechanical complications compared with all other causes of failure [HR 3.65 (95% CI 2.78-4.79), P < 0.001] and decreased with higher body mass index [HR 0.97 per kg/m(2) (95% CI 0.94-0.99), P = 0.01] and the 4 h dialysate:plasma creatinine ratio considered as a continuous variable [4 h D:P Cr HR 0.32 per unit (95% CI 0.12-0.89), P = 0.03]. 
The survival disadvantage associated with high peritoneal membrane transport status during PD treatment does not persist following transfer to HD. Early transfer to HD may be beneficial in this patient group.
Publisher: Elsevier BV
Date: 05-2010
DOI: 10.1038/KI.2010.16
Abstract: Encapsulating peritoneal sclerosis is a complication of peritoneal dialysis characterized by persistent, intermittent, or recurrent adhesive bowel obstruction. Here we examined the incidence, predictors, and outcomes of encapsulating peritoneal sclerosis (peritoneal fibrosis) by multivariate logistic regression in incident peritoneal dialysis patients in Australia and New Zealand. Matched case-control analysis compared the survival of patients with controls equivalent for age, gender, diabetes, and time on peritoneal dialysis. Of 7618 patients measured over a 13-year period, encapsulating peritoneal sclerosis was diagnosed in 33, giving an incidence rate of 1.8/1000 patient-years. The respective cumulative incidences of peritoneal sclerosis at 3, 5, and 8 years were 0.3, 0.8, and 3.9%. This condition was independently predicted by younger age and the duration of peritoneal dialysis, but not the rate of peritonitis. Twenty-six patients were diagnosed while still on peritoneal dialysis. Median survival following diagnosis was 4 years and not statistically different from that of 132 matched controls. Of the 18 patients who died, only 7 were attributed directly to peritoneal sclerosis. Our study shows that encapsulating peritoneal sclerosis is a rare condition, predicted by younger age and the duration of peritoneal dialysis. The risk of death is relatively low and not appreciably different from that of competing risks for mortality in matched dialysis control patients.
Publisher: Oxford University Press (OUP)
Date: 22-11-2018
DOI: 10.1093/NDT/GFX314
Abstract: Mounting evidence indicates an increased risk of cognitive impairment in adults with end-stage kidney disease on dialysis, but the extent and pattern of deficits across the spectrum of cognitive domains are uncertain. We conducted a cross-sectional study of 676 adult hemodialysis patients from 20 centers in Italy, aiming to evaluate the prevalence and patterns of cognitive impairment across five domains of learning and memory, complex attention, executive function, language and perceptual-motor function. We assessed cognitive function using a neuropsychological battery of 10 tests and calculated test and domain z-scores using population norms (age or age/education). We defined cognitive impairment as a z-score ≤ -1.5. Participants' median age was 70.9 years (range 21.6-94.1) and 262 (38.8%) were women. Proportions of impairment on each domain were as follows: perceptual-motor function 31.5% (150/476), language 41.2% (273/662), executive function 41.7% (281/674), learning and memory 42.2% (269/638), complex attention 48.8% (329/674). Among 474 participants with data for all domains, only 28.9% (n = 137) were not impaired on any domain, with 25.9% impaired on a single domain (n = 123), 17.3% on two (n = 82), 13.9% on three (n = 66), 9.1% on four (n = 43) and 4.9% (n = 23) on all five. Across patients, patterns of impairment combinations were diverse. In conclusion, cognitive impairment is extremely common in hemodialysis patients, across numerous domains, and patients often experience multiple deficits simultaneously. Clinical care should be tailored to meet the needs of patients with different types of cognitive impairment and future research should focus on identifying risk factors for cognitive decline.
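The z-score rule described in the abstract above (a domain is impaired when its standardised score falls at or below -1.5) can be sketched as follows. This is an illustrative sketch only; the function names and the example norm values are assumptions for demonstration, not data from the study:

```python
# Illustrative sketch of the impairment rule: z = (score - norm_mean) / norm_sd,
# with a domain classified as impaired when z <= -1.5.
# The norm mean/SD and the raw score below are invented example values.

def domain_z_score(raw_score: float, norm_mean: float, norm_sd: float) -> float:
    """Standardise a test score against a population norm."""
    return (raw_score - norm_mean) / norm_sd

def is_impaired(z: float, threshold: float = -1.5) -> bool:
    """Classify a domain as impaired when its z-score is at or below the threshold."""
    return z <= threshold

# Hypothetical participant: memory score 18 against an assumed norm of 25 (SD 4).
z = domain_z_score(18, 25, 4)       # (18 - 25) / 4 = -1.75
print(round(z, 2), is_impaired(z))  # -1.75 True
```

In the study, norms were stratified by age (or age and education), so `norm_mean` and `norm_sd` would be looked up per participant rather than fixed.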
Publisher: Elsevier BV
Date: 11-2006
DOI: 10.1053/J.AJKD.2006.08.010
Abstract: Current clinical practice guidelines recommend that no particular type of peritoneal dialysis (PD) catheter has been proved superior to another. However, a recent Cochrane review recommended the need for a large, well-designed, randomized, controlled trial of straight versus coiled PD catheters because of the paucity and suboptimal quality of previously performed trials. A randomized controlled trial was undertaken at 2 metropolitan teaching hospitals comparing the effects of straight versus coiled PD catheters on time to catheter malposition (primary outcome), catheter-associated infection, technique failure, and all-cause mortality. One hundred thirty-two PD patients were enrolled and randomly assigned to insertion of a coiled (n = 62) or straight catheter (n = 70). There was no significant difference in time to laparoscopic reposition between the 2 cohorts (log-rank score, 0.41; P = 0.52). However, median technique survival was significantly worse for coiled catheters (1.5 years; 95% confidence interval [CI], 1.2 to 1.8) compared with straight catheters (2.1 years; 95% CI, 1.8 to 2.5; P < 0.05), primarily because of increased risk for inadequate dialytic clearance with the former. On univariate Cox proportional hazards model analysis, insertion of a coiled PD catheter was associated significantly with a greater risk for technique failure (unadjusted hazard ratio, 1.86; 95% CI, 1.03 to 3.36). No difference was observed between the 2 groups with respect to catheter-associated infections or overall patient survival. Coiled catheters do not influence the risk for drainage failure caused by catheter malposition compared with straight catheters, but are associated with significantly increased risk for PD technique failure, primarily because of inadequate dialytic clearance.
Publisher: Wiley
Date: 25-04-2011
DOI: 10.1111/J.1440-1797.2011.01450.X
Abstract: Hypovitaminosis D is a significant health-care burden worldwide, particularly in susceptible populations such as those with chronic kidney disease (CKD). Recent epidemiological studies have identified that both higher serum vitamin D concentrations and use of vitamin D supplements may confer a survival benefit both in terms of all-cause and cardiovascular mortality. There is potential to investigate this inexpensive therapy for the CKD population, which suffers excessive cardiovascular events, although the mechanisms explaining this link have yet to be fully elucidated. This review discusses potential mechanisms identified in the basic science literature that may provide important insights into how vitamin D may orchestrate a change in cardiovascular risk profile through such diverse mechanisms as inflammation, atherogenesis, glucose homeostasis, vascular calcification, renin-angiotensin regulation and alterations in cardiac physiology. Where available, the clinical translation of these concepts to intervention trials in the CKD population will be reviewed.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 02-2003
Publisher: Elsevier BV
Date: 05-01-2004
DOI: 10.1016/J.JCHROMB.2003.10.033
Abstract: The immunosuppressant drug mycophenolic acid (MPA) and its major metabolite, mycophenolic acid glucuronide (MPAG), are highly bound to albumin. An HPLC-tandem-MS (HPLC/MS/MS) and an HPLC-UV assay were developed to measure free (unbound) concentrations of MPA and MPAG, respectively. Ultrafiltrate was prepared from plasma (500 µl) by ultrafiltration at 3000 × g for 20 min (20 °C). Both MPA and MPAG were isolated from ultrafiltrate (100 µl) by acidification and C18 solid-phase extraction. Free MPA was measured by electrospray tandem mass spectrometry using selected reaction monitoring (MPA: m/z 338.2 → 206.9) in positive ionisation mode. Chromatography was performed on a PFPP column (50 mm × 2 mm, 5 µm). Total analysis time was 7 min. The assay was linear over the range 1-200 µg/l with a limit of quantification of 1 µg/l. The inter-day accuracy and imprecision of quality controls (7.5, 40, 150 µg/l) were 94-99% and < 7%, respectively. Free MPAG was chromatographed on a C18 Nova-Pak column (150 mm × 3.9 mm, 5 µm) using a binary gradient over 20 min. The eluent was monitored at 254 nm. The assay was linear over the range 1-50 mg/l with the limit of quantification at 2.5 mg/l. The inter-day accuracy and imprecision of quality controls (5, 20, 45 mg/l) were 101-107% and < 8% (n = 4), respectively. For both methods no interfering substances were found in ultrafiltrate from patients not receiving MPA. The methods described have a suitable dynamic linear range to facilitate the investigation of free MPA and MPAG pharmacokinetics in transplant patients. Further, this is the first reported HPLC-UV method to determine free MPAG concentrations.
Publisher: Elsevier BV
Date: 05-2006
DOI: 10.1053/J.AJKD.2006.01.014
Abstract: Previous small uncontrolled studies suggested that fludrocortisone may significantly decrease serum potassium concentrations in hemodialysis patients, possibly through enhancement of colonic potassium secretion. The aim of this study is to evaluate the effect of oral fludrocortisone on serum potassium concentrations in hyperkalemic hemodialysis patients in an open-label randomized controlled trial. Thirty-seven hemodialysis patients with predialysis hyperkalemia were randomly allocated to administration of either oral fludrocortisone (0.1 mg/d; n = 18) or no treatment (control; n = 19) for 3 months. The primary outcome measure was midweek predialysis serum potassium concentration, which was measured monthly during the trial. Prospective power calculations indicated that the study had an 80% probability of detecting a decrease in serum potassium levels of 0.7 mEq/L (0.7 mmol/L). Baseline patient characteristics were similar, except for slightly longer total weekly dialysis hours in the fludrocortisone group (13.0 ± 1.3 versus 12.1 ± 1.0; P = 0.02). At the end of the study period, no significant changes in serum potassium concentrations were observed between the fludrocortisone and control groups (4.8 ± 0.5 versus 5.2 ± 0.7 mEq/L [mmol/L], respectively; P = 0.10). Similar results were obtained when changes in serum potassium levels over time were examined between the 2 arms by using repeated-measures analysis of variance, with or without adjustment for total weekly dialysis hours. Secondary outcomes, including predialysis mean arterial pressure, interdialytic weight gain, serum sodium level, and hospitalization for hyperkalemia, were not significantly different between groups. There were no observed adverse events. Administering fludrocortisone to hyperkalemic hemodialysis patients is safe and well tolerated, but does not achieve clinically important decreases in serum potassium levels.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 08-2015
DOI: 10.2215/CJN.00840115
Publisher: Elsevier BV
Date: 09-2014
DOI: 10.1016/J.NUMECD.2014.04.006
Abstract: There is a growing body of evidence supporting the nephrovascular toxicity of indoxyl sulphate (IS) and p-cresyl sulphate (PCS). Nonetheless, a comprehensive description of how these toxins accumulate over the course of chronic kidney disease (CKD) is lacking. This cross-sectional observational study included a convenience sample of 327 participants with kidney function categorised as normal, non-dialysis CKD and end-stage kidney disease (ESKD). Participants underwent measurements of serum total and free IS and PCS and assessment of cardiovascular history and structure (carotid intima-media thickness [cIMT, a measure of arterial stiffness]), and endothelial function (brachial artery reactivity [flow-mediated dilation (BAR-FMD) and glyceryl trinitrate (BAR-GTN)]). Across the CKD spectrum there was a significant increase in both total and free IS and PCS and their free fractions, with the highest levels observed in the ESKD population. Within each CKD stage, concentrations of PCS, total and free, were significantly greater than IS (all p < 0.01). Both IS and PCS, free and total, were correlated with BAR-GTN (ranging from r = -0.33 to -0.44) and cIMT (r = 0.19 to 0.21), even after adjusting for traditional risk factors (all p < 0.01). Further, all toxins were independently associated with the presence of cardiovascular disease (all p < 0.02). More advanced stages of CKD are associated with progressive increases in total and free serum IS and PCS, as well as increases in their free fractions. Total and free serum IS and PCS were independently associated with structural and functional markers of cardiovascular disease. Studies of therapeutic interventions targeting these uraemic toxins are warranted.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 15-08-2003
Publisher: Elsevier BV
Date: 09-2002
Abstract: Care of patients with end-stage renal disease (ESRD) is important and resource intensive. To enable ESRD programs to develop strategies for more cost-efficient care, an accurate estimate of the cost of caring for patients with ESRD is needed. The objective of our study is to develop an updated and accurate itemized description of costs and resources required to treat patients with ESRD on dialysis therapy and contrast differences in resources required for various dialysis modalities. One hundred sixty-six patients who had been on dialysis therapy for longer than 6 months and agreed to enrollment were followed up prospectively for 1 year. Detailed information on baseline patient characteristics, including comorbidity, was collected. Costs considered included those related to outpatient dialysis care, inpatient care, outpatient nondialysis care, and physician claims. We also estimated separately the cost of maintaining the dialysis access. Overall annual costs of care for in-center, satellite, and home/self-care hemodialysis and peritoneal dialysis were US $51,252 (95% confidence interval [CI], 47,680 to 54,824), $42,057 (95% CI, 39,523 to 44,592), $29,961 (95% CI, 21,252 to 38,670), and $26,959 (95% CI, 23,500 to 30,416), respectively (P < 0.001). After adjustment for the effect of other important predictors of cost, such as comorbidity, these differences persisted. Among patients treated with hemodialysis, the cost of vascular access-related care was lower by more than fivefold for patients who began the study period with a functioning native arteriovenous fistula compared with those treated with a permanent catheter or synthetic graft (P < 0.001). To maximize the efficiency with which care is provided to patients with ESRD, dialysis programs should encourage the use of home/self-care hemodialysis and peritoneal dialysis.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 09-2011
Publisher: SAGE Publications
Date: 09-2011
Publisher: Wiley
Date: 02-2010
DOI: 10.1111/J.1440-1797.2009.01191.X
Abstract: Renal nurses in Australia and New Zealand are critical to the care of patients with chronic kidney disease (CKD), especially those on dialysis. We aimed to obtain the opinions of renal nurses in Australia and New Zealand on the Caring for Australasians with Renal Impairment (CARI) Guidelines. A self-administered survey was distributed to all members of the professional organisation for renal nurses (Renal Society of Australasia) in 2006. The results were compared with those from a similar survey in 2002 and an identical 2006 survey of Australian and New Zealand nephrologists. Of the 173 respondents, more than 95% considered the Guidelines to be a good synthesis of the available evidence, 80% indicated that the Guidelines had significantly influenced their practice and 86% considered that the Guidelines had improved patient outcomes. Older respondents were less likely to perceive that the Guidelines had improved patient outcomes, and renal nurse educators were more likely to consider that the Guidelines were based on the best available evidence than other respondents. Respondents were generally more positive about the Guidelines in 2006 than in 2002. Although nephrologists were generally positive about the CARI Guidelines, renal nurses were more positive, especially regarding the effect of the Guidelines on practice and the improvement in health outcomes. Australian and New Zealand renal nurses valued the CARI Guidelines highly, used them in practice and considered that they led to improved patient outcomes. Positive responses towards the Guidelines increased between 2002 and 2006.
Publisher: Elsevier BV
Date: 09-2008
DOI: 10.1016/J.FCT.2008.06.090
Abstract: Dioscorea villosa (wild yam) rhizome extract is a medicinal herb that is commonly used to treat symptoms of menopause and rheumatoid arthritis. We had evidence from previous in vitro experiments that this extract is toxic and pro-fibrotic in renal cells and aimed to test whether this occurs in vivo. Sprague-Dawley rats received 0.79 g/kg/d D. villosa extract in their food or no treatment over 7, 14 and 28 d (n = 4 per group). Kidney and liver tissues were collected for protein extraction and Western immunoblots or fixed for special histologic stains, immunohistochemistry (IHC) and microscopy. Collagen deposition was assessed using Masson's trichrome staining and morphometry. Macrophage infiltration (ED-1), epithelial-to-mesenchymal transdifferentiation or activation of fibroblasts (vimentin, α-SMA), and pro-fibrotic growth factors (TGFβ1, CTGF) were assessed using IHC. Protein expression levels of the pro-inflammatory cytokine, TNF-α, the pro-fibrotic transcription factor, NF-κB, a measure of oxidative stress (heme oxygenase-1), α-SMA, vimentin and TGFβ1 were determined. Results showed that kidneys of the treated animals had significantly increased collagen, vimentin, TGFβ1, NF-κB, ED-1, CTGF and α-SMA by 28 d. In the liver, there was increased ED-1 and TGFβ1 in the centrilobular zone at 28 d in treated animals. In conclusion, there was no acute reno- or hepato-toxicity associated with administration of D. villosa. However, there was an increase in fibrosis in the kidneys and in inflammation in livers of rats consuming D. villosa for 28 days. Long term supplementation with D. villosa may be best avoided, especially in people with compromised renal function and in those who need to take other drugs which may alter kidney function.
Publisher: Wiley
Date: 06-2019
DOI: 10.1111/IMJ.14168
Abstract: Perioperative medicine is rapidly emerging as a key discipline to address the specific needs of high-risk surgical groups, such as those on chronic dialysis. Crude hospital separation rates for chronic dialysis patients are considerably higher than for patients with normal renal function, with up to 15% of admissions being related to surgical intervention. Dialysis dependency carries substantial mortality and morbidity risk compared to patients with normal renal function. This group of patients has a high comorbid burden and complex medical needs, making accurate perioperative planning essential. Existing perioperative risk assessment tools are unvalidated in chronic dialysis patients. Furthermore, they fail to incorporate important dialysis treatment-related characteristics that could potentially influence perioperative outcomes. There is a dearth of information on perioperative outcomes of Australasian dialysis patients. Current perioperative outcome estimates stem predominantly from the North American literature; however, the generalisability of these findings is limited, as the survival of North American dialysis patients is significantly inferior to that of their Australasian counterparts and potentially confounds reported perioperative outcomes, quite apart from regional variation in surgical indication and technique. We propose that data linkage between high-quality national registries will provide more complete data with more detailed patient and procedural information to allow for more informative analyses to develop and validate dialysis-specific risk assessment tools.
Publisher: Wiley
Date: 06-12-2018
Publisher: Oxford University Press (OUP)
Date: 21-11-2019
DOI: 10.1093/CKJ/SFY113
Abstract: Urinary 20-hydroxyeicosatetraenoic acid (20-HETE) has been associated with hypertension in women with elevated urinary cadmium (Cd) excretion rates. The present study investigates the urinary Cd and 20-HETE levels in relation to the estimated glomerular filtration rate (eGFR) and albumin excretion in men and women. A population-based, cross-sectional study, which included 225 women and 84 men aged 33–55 years, was conducted in a rural area known to be polluted with Cd. In all subjects, lower eGFR values were associated with higher urinary Cd excretion (P = 0.030), and tubulopathy markers N-acetyl-β-d-glucosaminidase (P < 0.001) and β2-microglobulin (β2-MG) (P < 0.001). On average, the hypertensive subjects with the highest quartile of urinary Cd had eGFR values 12 and 17 mL/min/1.73 m2 lower than those in the hypertensive (P = 0.009) and normotensive (P < 0.001) subjects with the lowest quartile of urinary Cd, respectively. In men, urinary albumin was inversely associated with 20-HETE (β = −0.384, P < 0.001), while showing a moderately positive association with systolic blood pressure (SBP) (β = 0.302, P = 0.037). In women, urinary albumin was not associated with 20-HETE (P = 0.776), but was associated with tubulopathy, reflected by elevated urinary excretion of β2-MG (β = 0.231, P = 0.002). Tubulopathy is a determinant of albumin excretion in women, while 20-HETE and SBP are determinants of urinary albumin excretion in men. Associations of chronic exposure to Cd with marked eGFR decline and renal tubular injury seen in both Cd-exposed men and women add to mounting research data that links Cd to the risk of developing chronic kidney disease.
Publisher: S. Karger AG
Date: 2015
DOI: 10.1159/000371554
Abstract: Background/Aims: Poor glycemic control can lead to increased morbidity and mortality in peritoneal dialysis (PD) patients. Serum fructosamine may be a more reliable marker of glycemic control than HbA1c in dialysis patients. Methods: We evaluated the effects of a glucose-sparing PD regimen on serum fructosamine. In the multicenter, controlled IMPENDIA trial, eligible diabetic PD patients were randomized (1:1) to a 24-hour combination of a glucose-sparing regimen (n = ) or a glucose-based therapy (n = ). Serum fructosamine and HbA1c were measured at baseline, 3 months and 6 months; fructosamine measurements were corrected for serum albumin (AlbF). Results: Serum fructosamine decreased from 297 to 253 µmol/l in the glucose-sparing group (95% confidence interval [CI] for the difference, −26 to −68; p < 0.001), and increased from 311 to 314 µmol/l in the glucose-only group (95% CI for the difference, −23 to +19; p = 0.87). The mean difference in change of fructosamine levels between groups at 6 months was 64 µmol/l (95% CI 29-99, p < 0.001). HbA1c decreased versus baseline in both groups (treatment difference 0.3%, p = 0.07). The correlation between AlbF and baseline fasting serum glucose was stronger than that seen between HbA1c and baseline fasting serum glucose (r = 0.47, p < 0.0001 and r = 0.31, p < 0.0001, respectively). Conclusion: A glucose-sparing regimen (P-E-N) improved glycemic control as measured by serum fructosamine. Further studies are needed to establish fructosamine targets that will reduce the morbidity risk related to hyperglycemia in PD patients.
Publisher: AIP Publishing
Date: 10-2006
DOI: 10.1063/1.2336433
Abstract: When used for the production of an x-ray imaging backlighter source on Sandia National Laboratories' 20 MA, 100 ns rise-time Z accelerator [M. K. Matzen et al., Phys. Plasmas 12, 055503 (2005)], the terawatt-class, multikilojoule, 526.57 nm Z-Beamlet laser (ZBL) [P. K. Rambo et al., Appl. Opt. 44, 2421 (2005)], in conjunction with the 6.151 keV, Mn–Heα curved-crystal imager [D. B. Sinars et al., Rev. Sci. Instrum. 75, 3672 (2004)], is capable of providing a high quality x radiograph per Z shot for various high-energy-density physics experiments. Enhancements to this imaging system during 2005 have led to the capture of inertial confinement fusion capsule implosion and complex hydrodynamics images of significantly higher quality. The three main improvements, all leading effectively to enhanced image plane brightness, were bringing the source inside the Rowland circle to approximately double the collection solid angle, replacing direct exposure film with Fuji BAS-TR2025 image plate (read with a Fuji BAS-5000 scanner), and generating a 0.3–0.6 ns, ∼200 J prepulse 2 ns before the 1.0 ns, ∼1 kJ main pulse to more than double the 6.151 keV flux produced compared with a single 1 kJ pulse. It appears that the 20 ± 5 μm imaging resolution is limited by the 25 μm scanning resolution of the BAS-5000 unit, and to this end, a higher resolution scanner will replace it. ZBL is presently undergoing modifications to provide two temporally separated images ("two-frame") per Z shot for this system before the accelerator closes down in summer 2006 for the Z-refurbished (ZR) upgrade. In 2008, after ZR, it is anticipated that the high-energy petawatt (HEPW) addition to ZBL will be completed, possibly allowing high-energy 11.2224 and 15.7751 keV Kα1 curved-crystal imaging to be performed.
With an ongoing several-year project to develop a highly sensitive multiframe ultrafast digital x-ray camera (MUDXC), it is expected that two-frame HEPW 11 and 16 keV imaging and four-frame ZBL 6.151 keV curved-crystal imaging will be possible. MUDXC will be based on the technology of highly cooled silicon and germanium photodiode arrays and ultrafast, radiation-hardened integrated circuitry.
Publisher: Elsevier BV
Date: 06-2016
Publisher: Elsevier BV
Date: 11-2018
DOI: 10.1016/J.EJIM.2018.06.017
Abstract: Infection is one of the main reasons for hospitalization worldwide, and is associated with an increased risk of cardiovascular mortality. It is unclear whether this association is modified by the presence of reduced renal function. The aim of this study was to analyze the relationship between estimated glomerular filtration rate (eGFR) and cardiovascular mortality in patients hospitalized with infection. This cohort study included all adult, incident patients who were hospitalized at one of four hospitals in China between 2012 and 2015, had a discharge diagnosis of infection, and had a serum creatinine measurement at admission. Patients receiving renal replacement therapy were excluded. Hospital data were linked to death registry data. All-cause and cardiovascular mortality were evaluated according to admission eGFR (≥60 [reference], 30-59 and <30 mL/min/1.73 m2). During a median follow-up period of 2.39 years, 40,524 patients were hospitalized with infection (mean age 61 years; 54.3% female; 18.4% diabetic). Of these, 4781 died. Lower admission eGFR was associated with progressively increased risks of cardiovascular mortality. Patients hospitalized with infections and reduced renal function have significantly increased risks of cardiovascular mortality. Heart status should be carefully monitored following infections, especially for those with reduced renal function.
Publisher: No publisher found
Date: 2002
Publisher: Wiley
Date: 15-12-2020
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 31-08-2018
DOI: 10.2215/CJN.09590818
Publisher: BMJ
Date: 15-05-2006
Publisher: SAGE Publications
Date: 21-01-2020
Publisher: Wiley
Date: 07-11-2007
Publisher: Elsevier BV
Date: 07-2013
DOI: 10.1053/J.JRN.2012.07.001
Abstract: This study aims to establish the utility of the Nutrition Impact Symptoms (NIS), a part of the Patient-Generated Subjective Global Assessment (PG-SGA), as a nutritional screening tool in patients receiving hemodialysis (HD). This was a prospective observational study. The study took place in a single public tertiary in-center dialysis facility in Australia. Patients included 213 individuals receiving maintenance HD for at least 3 months who were older than 18 years of age (mean age, 58.9 ± 16.3 years; 55.4% [n = 118] male patients). Outcome measures were malnutrition, classified by the Subjective Global Assessment rating (SGA, B or C); poor nutrition-related clinical outcome (decline in weight [>5%] or SGA, or reduction in serum albumin [>5 g/L]); and 12-month mortality. Patients assessed as malnourished totaled 23.5% (n = 50). Total PG-SGA and NIS scores showed a comparable ability to predict malnutrition (area under the curve, 0.93 [95% confidence interval {CI}, 0.90-0.97] and 0.86 [95% CI, 0.80-0.93], respectively). NIS (score ≥2) was independently related to poor nutrition-related clinical outcome (odds ratio [OR], 3.03; 95% CI, 1.47-6.20) and mortality (OR, 1.11; 95% CI, 1.03-1.20), adjusted for age, dialysis vintage, serum albumin level, and comorbidities. NIS score is a promising nutritional screening tool for the identification of patients receiving hemodialysis who are at risk of malnutrition and poor clinical outcome. Further research is required to investigate the reliability and utility of this tool in a larger population group.
Publisher: Elsevier BV
Date: 03-2004
DOI: 10.1053/J.AJKD.2003.11.010
Abstract: Factors that predict peritoneal transport status in peritoneal dialysis (PD) patients are poorly understood. The aim of the present study is to determine these factors in Australian and New Zealand incident PD patients. The study included all patients on the Australian and New Zealand Dialysis and Transplant Registry who started PD therapy between April 1, 1991, and March 31, 2002, and underwent a peritoneal equilibration test (PET) within the first 6 months. Predictors of peritoneal transport category and dialysate-plasma creatinine ratio at 4 hours (D-P Cr 4h) were assessed by multivariate ordinal logistic regression and multiple linear regression, respectively. A total of 3,188 patients were studied. Mean D-P Cr 4h was 0.69 ± 0.13. High transport status was associated with older age (adjusted odds ratio [OR], 1.08 for each 10 years; 95% confidence interval [CI], 1.03 to 1.13), Maori and Pacific Islander racial origin (OR, 1.48; 95% CI, 1.13 to 1.94), and normal body mass index (BMI ≥ 30 kg/m2: OR, 0.71; 95% CI, 0.58 to 0.86), but was not independently predicted by sex, diabetes, other comorbid diseases, smoking, previous hemodialysis therapy or transplantation, or residual renal function. Similar results were found when peritoneal permeability was modeled as a continuous variable (D-P Cr 4h). In Australian and New Zealand PD patients, higher peritoneal transport status is independently associated with racial origin, older age, and lower BMI. The diversity of peritoneal transport characteristics in different ethnic populations suggests that additional validation of PET measurements in various racial groups and study of their relationship to patient outcomes are warranted.
Publisher: Elsevier BV
Date: 08-2018
Publisher: SAGE Publications
Date: 13-01-2020
Abstract: Different kidney replacement therapy modalities are available to manage end-stage kidney disease, such as home-based dialysis, in-center hemodialysis, and kidney transplantation. Although transitioning between modalities is common, data on how patients experience these transitions are scarce. This study explores patients’ perspectives of transitioning from a home-based to an in-center modality. Patients transitioning from peritoneal dialysis to in-center hemodialysis were purposively selected. Semi-structured interviews were performed, digitally recorded, and transcribed verbatim. Data analysis, consistent with Charmaz’ constructivist approach of grounded theory, was performed. Fifteen patients (10 males; mean age 62 years) participated. The conditions of the transitioning process impacted the participants’ experiences, resulting in divergent experiences and associated emotions. Some participants experienced a loss of control due to the therapy-related changes. Some felt tied down and that they had lost independence, whereas others stated they regained control as they felt relieved from responsibility. This paradox of control was related to the patient having or not having (1) experienced a fit of hemodialysis with their personal lifestyle, (2) a frame of reference, (3) higher care requirements, (4) insight into the underlying reasons for transitioning, and (5) trust in the healthcare providers. Care teams need to offer opportunities to elicit patients’ knowledge and fears, dispel myths, forge connections with other patients, and visit the dialysis unit before transition to alleviate anxiety. Interventions that facilitate a sense of control should be grounded in the meaning that the disorder has for the person and how it impacts their sense of self.
Publisher: Oxford University Press (OUP)
Date: 02-02-2018
DOI: 10.1093/CKJ/SFX153
Publisher: Elsevier BV
Date: 10-2018
DOI: 10.1016/J.KINT.2018.06.009
Abstract: Better prognostication of graft and patient outcomes among kidney transplant recipients with post-transplant lymphoproliferative disease (PTLD) in the rituximab era is needed to inform treatment decisions. Therefore, we sought to estimate the excess risks of death and graft loss in kidney transplant recipients with PTLD, and to determine risk factors for death. Using the ANZDATA registry, the risks of mortality and graft loss among recipients with and without PTLD were estimated using survival analysis. A group of 367 patients with PTLD (69% male, 85% white, mean age 43 years) were matched 1:4 to 1468 controls (69% male, 88% white, mean age 43 years), and followed for a mean of 16 years. Recipients with PTLD experienced poorer 10-year patient survival (41%, 95% confidence interval 36-47%) than controls (65%, 63-68%). Excess mortality occurred in the first 2 years post-transplant (hazard ratio 8.5, 6.7-11), but not thereafter (1.0, 0.76-1.3). Cerebral lymphoma (2.0, 1.3-3.1), bone marrow disease (2.0, 1.2-3.3) and year of diagnosis prior to 2000 (2.2, 1.4-3.5; diagnosis after 2000 as reference) were risk factors for death. PTLD did not confer an excess risk of graft loss (1.08, 0.69-1.70). Thus, PTLD is a risk factor for death, particularly in the first two years after diagnosis. Cerebral and bone marrow disease were associated with increased mortality risk, but overall survival in the rituximab era (post 2000) has improved.
Publisher: Wiley
Date: 12-2000
Publisher: Wiley
Date: 26-06-2002
Publisher: American Physiological Society
Date: 10-2014
DOI: 10.1152/AJPRENAL.00205.2014
Abstract: The mechanism(s) underlying renoprotection by peroxisome proliferator-activated receptor (PPAR)-γ agonists in diabetic and nondiabetic kidney disease are not well understood. Mitochondrial dysfunction and oxidative stress contribute to kidney disease. PPAR-γ upregulates proteins required for mitochondrial biogenesis. Our aim was to determine whether PPAR-γ has a role in protecting the kidney proximal tubular epithelium (PTE) against mitochondrial destabilisation and oxidative stress. HK-2 PTE cells were subjected to oxidative stress (0.2–1.0 mM H2O2) for 2 and 18 h and compared with untreated cells for apoptosis, mitosis (morphology/biomarkers), cell viability (MTT), superoxide (dihydroethidium), mitochondrial function (MitoTracker red and JC-1), ATP (luminescence), and mitochondrial ultrastructure. PPAR-γ, phospho-PPAR-γ, PPAR-γ coactivator (PGC)-1α, Parkin (Park2), p62, and light chain (LC)3β were investigated using Western blots. PPAR-γ was modulated using the agonists rosiglitazone, pioglitazone, and troglitazone. Mitochondrial destabilisation increased with H2O2 concentration, ATP decreased (2 and 18 h, P < 0.05), MitoTracker red and JC-1 fluorescence indicated loss of mitochondrial membrane potential, and superoxide increased (18 h, P < 0.05). Electron microscopy indicated sparse mitochondria, with disrupted cristae. Mitophagy was evident at 2 h (Park2 and LC3β increased, p62 decreased). Impaired mitophagy was indicated by p62 accumulation at 18 h (P < 0.05). PPAR-γ expression decreased, phospho-PPAR-γ increased, and PGC-1α decreased (2 h), indicating aberrant PPAR-γ activation and reduced mitochondrial biogenesis. Cell viability decreased (2 and 18 h, P < 0.05). PPAR-γ agonists promoted further apoptosis. In summary, oxidative stress promoted mitochondrial destabilisation in kidney PTE, in association with increased PPAR-γ phosphorylation. PPAR-γ agonists failed to protect PTE.
Despite positive effects in other tissues, PPAR-γ activation appears to be detrimental to kidney PTE health when oxidative stress induces damage.
Publisher: Wiley
Date: 06-2009
DOI: 10.1111/J.1440-1797.2009.01114.X
Abstract: Chronic kidney disease mineral and bone disorder (CKD-MBD), characterized by disturbances of calcium, phosphate and parathyroid hormone, bone abnormalities and vascular and soft tissue calcification, is highly prevalent in CKD and is a strong, independent predictor of bone fracture, cardiovascular disease and death. Clinical practice guidelines, such as the Kidney Disease Outcomes Quality Initiative (KDOQI) and Caring for Australasians with Renal Impairment (CARI), support the use of phosphate binders, vitamin D compounds and calcimimetics for treatment of CKD-MBD and recommend stringent targets for serum calcium, phosphate and parathyroid hormone. However, these recommendations are based primarily on the results of observational cohort studies and randomized controlled trials employing surrogate outcome measures. The aim of this paper is to review the available evidence addressing whether therapeutic strategies targeting CKD-MBD and its surrogate outcome measures appreciably influence patient-level outcomes ('hard' clinical end-points).
Publisher: Wiley
Date: 2007
DOI: 10.1111/J.1542-4758.2007.00146.X
Abstract: Cardiovascular disease accounts for 40% to 50% of deaths in dialysis populations. Overall, the risk of cardiac mortality is 10-fold to 20-fold greater in dialysis patients than in age- and sex-matched controls without chronic kidney disease. The aim of this paper is to review critically the evidence that cardiac outcomes in dialysis patients are modified by cardiovascular risk factor interventions. There is limited, but as yet inconclusive, controlled trial evidence that cardiovascular outcomes in dialysis populations may be improved by antioxidants (vitamin E or acetylcysteine), ensuring that hemoglobin levels do not exceed 120 g/L (especially in the setting of known cardiovascular disease), prescribing carvedilol in the setting of dilated cardiomyopathy, and by using cinacalcet in uncontrolled secondary hyperparathyroidism. Similarly, there are a number of negative controlled trials, which have demonstrated that statins, high-dose folic acid, angiotensin-converting enzyme inhibitors, multiple risk factor intervention via multidisciplinary clinics, and high-dose or high-flux dialysis are ineffective in preventing cardiovascular disease. Although none of these studies could be considered conclusive, the negative trials to date should raise significant concerns about the heavy reliance of current clinical practice guidelines on extrapolation of findings from cardiovascular intervention trials in the general population. It may be that cardiovascular disease in dialysis populations is less amenable to intervention, either because of the advanced stage of chronic kidney disease or because the pathogenesis of cardiovascular disease in dialysis patients is different from that in the general population. Large, well-conducted, multicenter randomized controlled trials in this area are urgently required.
Publisher: Oxford University Press (OUP)
Date: 22-04-2016
DOI: 10.1093/NDT/GFV115
Abstract: Existing Australasian and international guidelines outline antibiotic and antifungal measures to prevent the development of treatment-related infection in peritoneal dialysis (PD) patients. Practice patterns and rates of PD-related infection vary widely across renal units in Australia and New Zealand and are known to vary significantly from guideline recommendations, resulting in PD technique survival rates that are lower than those achieved in many other countries. The aim of this study was to determine if there is an association between current practice and PD-related infection outcomes and to identify the barriers and enablers to good clinical practice. This is a multicentre network study involving eight PD units in Australia and New Zealand, with a focus on adherence to guideline recommendations on antimicrobial prophylaxis in PD patients. Current practice was established by asking the PD unit heads to respond to a short survey about practice protocols and policies, and a 'process map' was constructed following a face-to-face interview with the primary PD nurse at each unit. The perceived barriers/enablers to adherence to the relevant guideline recommendations were obtained from the completion of 'cause and effect' diagrams by the nephrologist and PD nurse at each unit. Data on PD-related infections were obtained for the period 1 January 2011 to 31 December 2011. Perceived barriers that may result in reduced adherence to guideline recommendations included lack of knowledge, procedural lapses, lack of a centralized patient database, patients with non-English speaking background, professional concern about antibiotic resistance, medication cost and the inability of nephrologists and infectious diseases staff to reach consensus on unit protocols. The definitions of PD-related infections used by some units varied from those recommended by the International Society for Peritoneal Dialysis, particularly with exit-site infection (ESI).
Wide variations were observed in the rates of ESI (0.06-0.53 episodes per patient-year) and peritonitis (0.31-0.86 episodes per patient-year). Despite the existence of strongly evidence-based guideline recommendations, adherence varied widely between PD units, which might contribute to the wide variation in PD-related infection rates between units. Although individual patient characteristics may account for some of this variability, inconsistencies in the processes of care to prevent infection in PD patients also play a role.
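Infection rates expressed as "episodes per patient-year", as above, are crude incidence rates: total episodes divided by total follow-up time. A small illustrative helper (hypothetical numbers, not the study's data):

```python
def episodes_per_patient_year(n_episodes, total_followup_days):
    """Crude incidence rate: episodes per patient-year of follow-up.

    total_followup_days is the sum of observed days across all patients.
    """
    patient_years = total_followup_days / 365.25
    return n_episodes / patient_years

# e.g. 5 peritonitis episodes observed over 10 patient-years
rate = episodes_per_patient_year(5, 10 * 365.25)  # 0.5 episodes/patient-year
```

Because follow-up is summed across patients, a unit can report the same rate whether it followed many patients briefly or few patients for years.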
Publisher: Wiley
Date: 10-2019
DOI: 10.5694/MJA2.50354
Abstract: Sex- and age-specific incidence rates of patients with treated end-stage kidney disease (ESKD) in Australia are comparable to those in European countries, but substantially lower compared with those in the United States, Canada and many Asian countries. The incidence rates of treated ESKD in Australia increase with advancing age; however, the incidence of ESKD is likely to be underestimated because a proportion of patients with ESKD (about 50%) remain untreated. Late referral to nephrologists has reduced over the past decade, temporally associated with improved ESKD recognition. However, late referral still occurs in one in five Australians with ESKD. One in two Australians with ESKD has diabetes, with up to 35% of cases directly attributed to diabetes. Mortality rates for patients with ESKD remain substantially higher compared with the age-matched general population, although there has been a significant improvement in survival over time. Cardiovascular disease and cancer are the two most common causes of death in patients with ESKD.
Publisher: Elsevier BV
Date: 07-2012
DOI: 10.1053/J.AJKD.2011.12.030
Abstract: Dialysis modality preferences of patients with chronic kidney disease (CKD) and family caregivers are important, yet rarely quantified. Prospective, unlabeled, discrete-choice experiment with random-parameter logit analysis. Adults with stages 3-5 CKD and caregivers educated about dialysis treatment options from 8 Australian renal clinics. Preferences for and trade-offs between the dialysis treatment attributes of life expectancy, number of hospital visits per week, ability to travel, hours per treatment, treatment time of day, subsidized transport service, and flexibility of treatment schedule. Results presented as ORs for preferring home-based or in-center dialysis to conservative care. 105 predialysis patients and 73 family caregivers completed the study. Median patient age was 63 years, and mean estimated glomerular filtration rate was 18.1 (range, 6-34) mL/min/1.73 m(2). Median caregiver age was 61 years. Home-based dialysis (either peritoneal or home hemodialysis) was chosen by patients in 65% of choice sets; in-center dialysis, in 35%; and conservative care, in 10%. For caregivers, this was 72%, 25%, and 3%, respectively. Both patients and caregivers preferred longer rather than shorter hours of dialysis (ORs of 2.02 [95% CI, 1.51-2.70] and 2.67 [95% CI, 1.85-3.85] for patients and caregivers, respectively), but were less likely to choose nocturnal than daytime dialysis (ORs of 0.07 [95% CI, 0.01-0.75] and 0.03 [95% CI, 0.01-0.20]). Patients were willing to forgo 23 (95% CI, 19-27) months of life expectancy with home-based dialysis to decrease their travel restrictions. For caregivers, this was 17 (95% CI, 16-18) patient-months. Data were limited to stated preferences rather than actual choice of dialysis modality. Our study suggests that it is rare for caregivers to prefer conservative nondialytic care for family members with CKD.
Home-based dialysis modalities that enable patients and their family members to travel with minimal restriction would be strongly aligned with the preferences of both parties.
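In a discrete-choice experiment, a trade-off such as "23 months of life expectancy forgone" is computed as a marginal rate of substitution: the ratio of the estimated utility coefficient for the attribute to the coefficient per month of life expectancy. A hedged sketch with made-up coefficients (the values below are illustrative, not estimates from this study):

```python
def months_traded(beta_attribute, beta_per_month_life):
    """Marginal rate of substitution between a dialysis attribute and
    life expectancy: months of life a respondent would forgo for a
    one-unit improvement in the attribute.

    Both arguments are logit utility coefficients from a fitted
    choice model; the example values are hypothetical.
    """
    return beta_attribute / beta_per_month_life

# Hypothetical coefficients: 1.15 utility for unrestricted travel,
# 0.05 utility per additional month of life expectancy
trade = months_traded(beta_attribute=1.15, beta_per_month_life=0.05)  # 23 months
```

Confidence intervals for such ratios are usually obtained by the delta method or bootstrapping over the fitted model, not shown here.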
Publisher: Wiley
Date: 12-2010
DOI: 10.1111/J.1440-1797.2010.01433.X
Abstract: Randomized controlled clinical trials represent the gold standard of research into health-care interventions but conducting a randomized trial requires careful planning, structures and procedures. The conduct of a clinical trial is a collaborative effort between investigators, participants and a range of professionals involved both centrally and locally in the coordination and execution of the study. In this article, the key steps to conducting a randomized controlled trial are summarized.
Publisher: Elsevier BV
Date: 02-2019
DOI: 10.1016/J.CLNU.2017.11.020
Abstract: Patients on hemodialysis suffer from high risk of premature death, which is largely attributed to cardiovascular disease, but interventions targeting traditional cardiovascular risk factors have made little or no difference. Long chain n-3 polyunsaturated fatty acids (n-3 PUFA) are putative candidates to reduce cardiovascular disease. Diets rich in n-3 PUFA are recommended in the general population, although their role in the hemodialysis setting is uncertain. We evaluated the association between the dietary intake of n-3 PUFA and mortality for hemodialysis patients. The DIET-HD study is a prospective cohort study (January 2014-June 2017) in 9757 adults treated with hemodialysis in Europe and South America. Dietary n-3 PUFA intake was measured at baseline using the GA. During a median follow-up of 2.7 years (18,666 person-years), 2087 deaths were recorded, including 829 attributable to cardiovascular causes. One-third of the study participants consumed the amount of n-3 PUFA recommended for primary cardiovascular prevention (at least 1.75 g/week), and less than 10% consumed the amount recommended for secondary prevention (7-14 g/week). Compared to patients with the lowest tertile of dietary n-3 PUFA intake (<0.37 g/week), the adjusted hazard ratios (95% confidence interval) for cardiovascular mortality for patients in the middle (0.37 to <1.8 g/week) and highest (≥1.8 g/week) tertiles of n-3 PUFA were 0.82 (0.69-0.98) and 1.03 (0.84-1.26), respectively. Corresponding adjusted hazard ratios for all-cause mortality were 0.96 (0.86-1.08) and 1.00 (0.88-1.13), respectively. Dietary n-3 PUFA intake was not associated with cardiovascular or all-cause mortality in patients on hemodialysis. As dietary n-3 PUFA intake was low, the possibility that n-3 PUFA supplementation might mitigate cardiovascular risk has not been excluded.
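The exposure groups above are defined by fixed intake cut-points, so classification is a direct threshold test. A minimal classifier using the tertile boundaries quoted in the abstract:

```python
def pufa_intake_tertile(grams_per_week):
    """Classify weekly n-3 PUFA intake into the study's tertiles:
    lowest (<0.37 g/week), middle (0.37 to <1.8 g/week),
    highest (>=1.8 g/week). Cut-points are taken from the abstract.
    """
    if grams_per_week < 0.37:
        return "lowest"
    if grams_per_week < 1.8:
        return "middle"
    return "highest"
```

In the cohort itself, tertile boundaries were of course derived from the observed intake distribution rather than fixed in advance.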
Publisher: Elsevier BV
Date: 2018
DOI: 10.1053/J.AJKD.2017.08.018
Abstract: Advances in kidney transplantation have led to considerable improvements in short-term transplant and patient outcomes, but there are few data regarding long-term transplant outcomes in patients with vascular comorbid conditions. This study examined the association of vascular disease before transplantation with transplant and patient survival after transplantation and evaluated whether this association was modified by diabetes. All deceased donor kidney transplant recipients recorded in the Australia and New Zealand Dialysis and Transplant Registry (ANZDATA) for 1990 to 2012. Vascular disease burden. All-cause mortality and overall transplant loss. Potential interactions between diabetes and vascular disease for mortality and transplant loss were assessed using 2-way interaction terms. Of 7,128 recipients with 58,120 patient-years of follow-up, 854 (12.0%) and 263 (3.7%) had vascular diseases at 1 and 2 or more sites, respectively. Overall survival for recipients without vascular disease 15 years after transplantation was 65% compared with 35% and 22% among recipients with vascular disease at 1 and 2 or more sites, respectively (P<0.001). Compared with recipients without vascular disease, adjusted HRs for mortality and transplant loss were 1.75 (95% CI, 1.39-2.20; P<0.001) and 1.61 (95% CI, 1.30-1.99; P<0.001), respectively, for recipients with 2 or more vascular diseases. Among recipients without diabetes but with 2 or more vascular diseases, adjusted HRs for mortality and transplant loss were 2.10 (95% CI, 1.56-2.82; P<0.001) and 1.84 (95% CI, 1.39-2.42; P<0.001), respectively, compared with those without vascular disease. Similar associations were not observed for recipients with diabetes mellitus (P for interaction < 0.001). Selection bias and unmeasured residual confounders, such as the severity/extent of comorbid conditions, are likely to be present.
The impact of vascular disease on long-term outcomes was modified by the presence of diabetes, whereby excess risks for death and transplant loss are more apparent in recipients without diabetes.
Publisher: Oxford University Press (OUP)
Date: 13-03-2013
DOI: 10.1093/NDT/GFT050
Abstract: Although icodextrin has been shown to augment peritoneal ultrafiltration in peritoneal dialysis (PD) patients, its impact upon other clinical end points, such as technique survival, remains uncertain. This systematic review evaluated the effect of icodextrin use on patient-level clinical outcomes. The Cochrane CENTRAL Registry, MEDLINE, Embase and reference lists were searched (last search 13 September 2012) for randomized controlled trials of icodextrin versus glucose in the long dwell exchange. Summary estimates of effect were obtained using a random effects model. Eleven eligible trials (1222 patients) were identified. There was a significant reduction in episodes of uncontrolled fluid overload [two trials, 100 patients; relative risk (RR) 0.30, 95% confidence interval (CI) 0.15-0.59] and improvement in peritoneal ultrafiltration [four trials, 102 patients; mean difference (MD) 448.54 mL/day, 95% CI 289.28-607.80] without compromising residual renal function [four trials, 114 patients; standardized MD (SMD) 0.12, 95% CI -0.26 to 0.49] or urine output (three trials, 69 patients; MD -88.88, 95% CI -356.88 to 179.12) with icodextrin use for up to 2 years. There was no significant effect on peritonitis incidence (five trials, 607 patients; RR 0.97, 95% CI 0.76-1.23), peritoneal creatinine clearance (three trials, 237 patients; SMD 0.36, 95% CI -0.24 to 0.96), technique failure (three trials, 290 patients; RR 0.58, 95% CI 0.28-1.20), patient survival (six trials, 816 patients; RR 0.82, 95% CI 0.32-2.13) or adverse events. Icodextrin prescription improved peritoneal ultrafiltration, mitigated uncontrolled fluid overload and was not associated with increased risk of adverse events. No effects of icodextrin on technique or patient survival were observed, although trial sample sizes and follow-up durations were limited.
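The "random effects model" used to pool trial estimates in reviews like this is commonly the DerSimonian-Laird method: compute fixed-effect weights, estimate between-trial heterogeneity (tau-squared) from Cochran's Q, then re-weight. A compact sketch operating on log-effect estimates and their variances (toy inputs, not the review's data):

```python
import math

def dersimonian_laird(log_effects, variances):
    """DerSimonian-Laird random-effects pooling.

    log_effects: per-trial effect estimates on the log scale
    variances: their within-trial variances
    Returns (pooled log effect, standard error of the pooled effect).
    """
    w = [1.0 / v for v in variances]            # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * y for wi, y in zip(w, log_effects)) / sw
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_effects))
    df = len(log_effects) - 1
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)               # between-trial variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_re, log_effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se

# Two hypothetical trials with log-RRs 0.0 and 1.0, equal variance 0.1
pooled, se = dersimonian_laird([0.0, 1.0], [0.1, 0.1])
```

Exponentiating `pooled` and `pooled ± 1.96 * se` gives the pooled RR and its 95% CI on the ratio scale.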
Publisher: Public Library of Science (PLoS)
Date: 25-03-2021
DOI: 10.1371/JOURNAL.PONE.0249000
Abstract: The need for kidney transplantation drives efforts to expand organ donation. The decision to accept organs from donors with acute kidney injury (AKI) can result in a clinical dilemma in the context of conflicting reports from published literature. This observational study included all deceased donor kidney transplants performed in Australia and New Zealand between 1997 and 2017. The association of donor-AKI, defined according to KDIGO criteria, with all-cause graft failure was evaluated by multivariable Cox regression. Secondary outcomes included death-censored graft failure, death, delayed graft function (DGF) and acute rejection. The study included 10,101 recipients of kidneys from 5,774 deceased donors, of whom 1182 (12%) recipients received kidneys from 662 (11%) donors with AKI. There were 3,259 (32%) all-cause graft failures, which included 1,509 deaths with functioning graft. After adjustment for donor, recipient and transplant characteristics, donor AKI was not associated with all-cause graft failure (adjusted hazard ratio [HR] 1.11, 95% CI 0.99–1.26), death-censored graft failure (HR 1.09, 95% CI 0.92–1.28), death (HR 1.15, 95% CI 0.98–1.35) or graft failure when death was evaluated as a competing event (sub-distribution hazard ratio [sHR] 1.07, 95% CI 0.91–1.26). Donor AKI was not associated with acute rejection but was associated with DGF (adjusted odds ratio [OR] 2.27, 95% CI 1.92–2.68). Donor AKI stage was not associated with any kidney transplant outcome, except DGF. Use of kidneys with AKI for transplantation appears to be justified.
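Donor AKI here was "defined according to KDIGO criteria", which stage AKI by the rise in serum creatinine. A deliberately simplified creatinine-only staging helper (it ignores the urine-output and renal-replacement-therapy criteria of the full KDIGO definition; units are mg/dL; a sketch, not a clinical tool):

```python
def kdigo_creatinine_stage(baseline, peak, rise_within_48h=0.0):
    """Simplified KDIGO AKI stage from serum creatinine (mg/dL).

    Stage 3: >= 3.0x baseline, or peak >= 4.0 mg/dL
    Stage 2: 2.0-2.9x baseline
    Stage 1: 1.5-1.9x baseline, or a rise >= 0.3 mg/dL within 48 h
    Returns 0 when no creatinine criterion is met.
    """
    ratio = peak / baseline
    if ratio >= 3.0 or peak >= 4.0:
        return 3
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5 or rise_within_48h >= 0.3:
        return 1
    return 0
```

For donor assessment a single admission creatinine trajectory is usually all that is available, which is why creatinine-based staging dominates in registry analyses like this one.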
Publisher: Wiley
Date: 12-2004
DOI: 10.1111/J.1440-1797.2004.00355.X
Abstract: In this second of two articles regarding the renal toxicities or benefits of medicinal herbs, herbs are reported as being 'potentially beneficial' to the kidneys if there is strong in vivo evidence of renal protection from toxic substances or drugs; potent, specific renal anti-oxidant effects in vivo; cancer antiproliferative effects specific to the kidneys; or in vivo evidence of being beneficial in renal disease or failure. Among the herbs, polyherbal formulae and fungi with potential renal benefits are Cordyceps sinensis, Sairei-to, Rheum spp., Salvia miltiorrhiza and its component, magnesium lithospermate B and others.
Publisher: Springer Science and Business Media LLC
Date: 05-03-2018
DOI: 10.1038/S41598-018-22335-4
Abstract: Technique failure is a frequent complication of peritoneal dialysis (PD), but the association between causes of death-censored technique failure and mortality remains unclear. Using Australian and New Zealand Dialysis and Transplant (ANZDATA) registry data, we examined the associations between technique failure causes and mortality in all incident PD patients who experienced technique failure between 1989 and 2014. Of 4663 patients, 2415 experienced technique failure attributed to infection, 883 to inadequate dialysis, 836 to mechanical failure and 529 to social reasons. Compared to infection, the adjusted hazard ratios (HR) for all-cause mortality in the first 2 years were 0.83 (95%CI 0.70–0.98) for inadequate dialysis, 0.78 (95%CI 0.66–0.93) for mechanical failure and 1.46 (95%CI 1.24–1.72) for social reasons. The estimates from the competing risk models were similar. There was an interaction between age and causes of technique failure (P for interaction < 0.001), such that the greatest premature mortality was observed in patients aged years post social-related technique failure. There was no association between causes of technique failure and mortality beyond 2 years. In conclusion, infection and social-related technique failure are associated with premature mortality within 2 years post technique failure. Future studies examining these associations may help to improve outcomes in these patients.
Publisher: Elsevier BV
Date: 05-2003
DOI: 10.1016/S0272-6386(03)00199-9
Abstract: Abnormalities of the left ventricle are common in patients with end-stage renal disease (ESRD) both before and after the start of renal replacement therapy. The purpose of this study is to identify possible causes of subclinical left ventricular (LV) dysfunction in patients with ESRD. In particular, we sought to determine whether the presence of ESRD was itself associated with dysfunction independent of LV hypertrophy and coronary artery disease. Assessment of cardiovascular risk factors and dialysis adequacy was completed in 145 unselected patients with ESRD who were recruited from the renal dialysis unit and compared with age- and sex-matched controls. Among the 68 patients with ESRD who had undergone a dobutamine stress echocardiogram with normal findings, regional cardiac function was quantified by myocardial Doppler velocity, LV volumes and mass were measured using three-dimensional echocardiography, and vascular function was assessed using brachial artery reactivity (BAR). LV diastolic velocity was impaired in patients with ESRD, but there was no significant difference in systolic velocity compared with control patients of similar age. Age, diabetes mellitus, hypertension, and LV mass were independent predictors of diastolic velocity (model R2 = 0.45; P < 0.001), whereas age and risk factor number were predictors of systolic velocity (model R2 = 0.19; P = 0.002). Increasing risk factor number had no significant relationship with LV mass or volume. There was no detected association between BAR and incremental risk factors (P = 0.51). Subclinical LV dysfunction occurs in patients with ESRD, but is evidenced as abnormal myocardial diastolic, rather than systolic, function. Correlates of abnormal function are age, diabetes mellitus, hypertension, and LV mass, rather than ESRD alone, dialysis adequacy, or abnormal endothelial function.
Publisher: Elsevier BV
Date: 2023
DOI: 10.1053/J.JRN.2022.04.003
Abstract: Nutrition supplementation, including prebiotics and probiotics, is a therapeutic strategy for modulating the gut microbiome in chronic kidney disease (CKD). However, the acceptability of gut-targeted supplements in this population remains largely unexplored. This study aims to describe the perceptions of nutrition supplementation, and the acceptability and experiences of pre- and probiotics in adults with Stage 3-4 CKD. In this semi-structured interview study, adults with Stage 3-4 CKD (n = 30), aged 41-80 (mean 68) years, who had completed a 12-month prebiotic and probiotic intervention or placebo were interviewed between January and March 2019. Interviews were transcribed verbatim and analyzed thematically. Five themes were identified: integrating and sustaining routine supplementation (flexibility in prescription of prebiotics and probiotics, fitting in with regular routines); striving for health benefits (hoping to improve kidney health, hoping to improve general health, confirming health benefits); facilitating pre- and probiotic supplementation (perceiving pre- and probiotics as safe, side-effects from taking pre- and probiotics); empowering knowledge (valuing the opportunity to increase knowledge of gut health); and considerations for future use (questioning credibility of health claims, average palatability of prebiotic powder, cost concerns). Adults with Stage 3-4 CKD found pre- and probiotic supplements to be acceptable and complementary gut-targeted supplements. Individual preferences for nutrition supplementation should be considered alongside health knowledge to enhance uptake and adherence in practice.
Publisher: Bentham Science Publishers Ltd.
Date: 03-2010
DOI: 10.2174/1871520611009030186
Abstract: Silymarin and its major constituent, silibinin, are extracts from the medicinal plant Silybum marianum (milk thistle) and have traditionally been used for the treatment of liver diseases. Recently, these orally active, flavonoid agents have also been shown to exert significant anti-neoplastic effects in a variety of in vitro and in vivo cancer models, including skin, breast, lung, colon, bladder, prostate and kidney carcinomas. The aim of the present review is to examine the pharmacokinetics, mechanisms, effectiveness and adverse effects of silibinin's anti-cancer actions reported to date in pre-clinical and clinical trials. The review will also discuss the results of current research efforts seeking to determine the extent to which the effectiveness of silibinin as an adjunct cancer treatment is influenced by such factors as histologic subtype, hormonal status, stromal interactions and drug metabolising gene polymorphisms. The results of these studies may help to more precisely target and dose silibinin therapy to optimise clinical outcomes for oncology patients.
Publisher: AMPCo
Date: 10-2007
DOI: 10.5694/J.1326-5377.2007.TB01357.X
Abstract: Since publication of the Australasian Creatinine Consensus Working Group's position statement in 2005, most Australasian laboratories now automatically report an estimated glomerular filtration rate (eGFR) (based on the Modification of Diet in Renal Disease [MDRD] formula) with results of serum creatinine tests in adults. Anecdotal evidence suggests that automatic reporting of eGFR helps to identify asymptomatic kidney dysfunction at an earlier stage and to develop rational and appropriate management plans. Changes to the measurement and calibration of serum creatinine assays and issues regarding implementation of eGFR in clinical practice led the Australasian Creatinine Consensus Working Group to reconvene in 2007. The recommendations contained here build on the original 2005 position statement and consolidate the role of eGFR in clinical practice. The Working Group recommends that the eGFR upper reporting limit be extended to 90 mL/min/1.73 m2, with eGFR values above this amount being reported as "> 90 mL/min/1.73 m2", rather than as a precise figure. The Working Group has concluded that it is currently premature to recommend age-related decision points for eGFR. However, it is appropriate to advise medical practitioners that, in people aged ≥ 70 years, an eGFR in the range 45-59 mL/min/1.73 m2, if stable over time and unaccompanied by other evidence of kidney damage, may be interpreted as consistent with a typical eGFR for this age group and is unlikely to be associated with chronic kidney disease-related complications. Pending publication of validation studies, the Working Group recommends that Australasian laboratories continue to automatically report eGFR in Aboriginal and Torres Strait Islander peoples and other ethnic groups. The Working Group supports the use of eGFR to assist drug dosing decision making in general practice.
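The automatic reporting described above can be illustrated with the 4-variable MDRD equation (coefficients as published for IDMS-traceable creatinine assays) plus the recommended "> 90" reporting cap. The reporting helper is a sketch of the Working Group's rule, not laboratory software:

```python
def mdrd_egfr(serum_creatinine_mg_dl, age_years, female=False, black=False):
    """IDMS-traceable 4-variable MDRD estimate of GFR (mL/min/1.73 m2).

    eGFR = 175 * Scr^-1.154 * age^-0.203 * 0.742 (if female)
           * 1.212 (if black), with creatinine in mg/dL.
    """
    egfr = 175.0 * serum_creatinine_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

def report_egfr(egfr):
    """Apply the recommended upper reporting limit of 90 mL/min/1.73 m2."""
    if egfr > 90:
        return "> 90 mL/min/1.73 m2"
    return f"{egfr:.0f} mL/min/1.73 m2"
```

The cap exists because the MDRD equation is imprecise at near-normal GFR, so a precise figure above 90 would convey false certainty.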
Publisher: Elsevier BV
Date: 02-2018
Publisher: Elsevier BV
Date: 02-2018
Publisher: Elsevier BV
Date: 02-2018
Publisher: Springer US
Date: 2016
Publisher: SAGE Publications
Date: 11-2011
Publisher: Springer Science and Business Media LLC
Date: 28-11-2019
DOI: 10.1186/S13063-019-3784-7
Abstract: Vitamin D deficiency has been shown to be closely associated with peritoneal dialysis (PD)-related peritonitis. The aim of this study is to examine the feasibility of conducting a large, powered randomized controlled trial to determine the effects of vitamin D supplementation on the risk of PD-related peritonitis in patients who have already experienced an episode of peritonitis. This prospective, open-label randomized controlled pilot trial with blinded end-points aims to determine the feasibility of oral vitamin D supplementation and to explore its effects on the risk of subsequent PD-related peritonitis among PD patients who have recovered from a recent episode of peritonitis. Eligible patients will be randomized 1:1 to either oral vitamin D supplementation (2000 IU per day; intervention group) or no vitamin D supplementation (control group) in addition to usual care according to International Society for Peritoneal Dialysis guidelines. The sample size will be 30 patients for both groups. All participants will be followed for 12 months. The primary outcome is the assessment of feasibility (recruitment success, retention, adherence, safety) and fidelity (change in serum 25-hydroxyvitamin D level during follow-up) for a large, powered randomized controlled trial to determine the effects of vitamin D on the risk of PD-related peritonitis in the future. Secondary outcomes include time to peritonitis occurrence, recovery of peritonitis, peritonitis-related transition to hemodialysis, and peritonitis-related death (defined as death within 30 days of peritonitis onset). This is the first randomized controlled trial investigating the effects of vitamin D supplementation on the risk of subsequent PD-related peritonitis among patients on PD. The findings of this pilot study will determine the feasibility of conducting a full-scale randomized controlled trial, which may provide a new strategy for preventing PD-related peritonitis among PD patients.
ClinicalTrials.gov, NCT03264625. Registered on 29 August 2017.
Publisher: Wiley
Date: 30-03-2021
Publisher: SAGE Publications
Date: 09-2016
Publisher: Elsevier BV
Date: 10-2019
DOI: 10.1053/J.AJKD.2019.03.424
Abstract: In the general population, cognitive impairment is associated with increased mortality, and higher levels of education are associated with lower risks for cognitive impairment and mortality. These associations are not well studied in patients receiving long-term hemodialysis and were the focus of the current investigation. Prospective cohort study. Adult hemodialysis patients treated in 20 Italian dialysis clinics. Patients' cognitive function across 5 domains (memory, attention, executive function, language, and perceptual-motor function), measured using a neuropsychological assessment comprising 10 tests, and patients' self-reported years of education. All-cause mortality. Nested multivariable Cox regression models were used to examine associations of cognition (any domain impaired, number of domains impaired, and global function score from principal components analysis of unadjusted test scores) and education with mortality and whether there were interactions between them. 676 (70.6%) patients participated, with a median age of 70.9 years; 38.8% were women. Cognitive impairment was present in 79.4% (527/664; 95% CI, 76.3%-82.5%). During a median follow-up of 3.3 years (1,874 person-years), 206 deaths occurred. Compared to no cognitive impairment, adjusted HRs for mortality were 1.77 (95% CI, 1.07-2.93) for any impairment, 1.48 (95% CI, 0.82-2.68) for 1 domain impaired, 1.88 (95% CI, 1.01-3.53) for 2 domains, and 2.01 (95% CI, 1.14-3.55) for 3 to 5 domains. The adjusted HR was 0.68 (95% CI, 0.51-0.92) per standard deviation increase in global cognitive function score. Compared with primary or lower education, adjusted HRs were 0.79 (95% CI, 0.53-1.20) for lower secondary and 1.13 (95% CI, 0.80-1.59) for upper secondary or higher. The cognition-by-education interaction was not significant (P=0.7).
Potential selection bias from nonparticipation and missing data; no data on cognitive decline; associations with education were not adjusted for other socioeconomic factors. Cognitive impairment is associated with premature mortality in hemodialysis patients. Education does not appear to be associated with mortality.
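As a quick sanity check on the follow-up figures reported in the abstract above (206 deaths over 1,874 person-years), the crude mortality rate can be computed with a few lines of arithmetic. This is an illustrative sketch only, not part of the study's analysis:

```python
def rate_per_100_person_years(events: int, person_years: float) -> float:
    """Crude event rate per 100 person-years of follow-up."""
    return 100 * events / person_years

# Figures reported in the abstract: 206 deaths over 1,874 person-years.
crude_rate = rate_per_100_person_years(206, 1874)
print(round(crude_rate, 1))  # about 11 deaths per 100 person-years
```

Note that this crude rate is unadjusted; the hazard ratios quoted in the abstract come from multivariable Cox models, which this simple calculation does not replace.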
Publisher: Oxford University Press (OUP)
Date: 24-07-2014
DOI: 10.1093/NDT/GFU254
Abstract: Alport syndrome is a rare inheritable renal disease. Clinical outcomes for patients progressing to end-stage kidney disease (ESKD) are not well described. This study aimed to investigate the characteristics and clinical outcomes of patients from Australia and New Zealand commencing renal replacement therapy (RRT) for ESKD due to Alport syndrome between 1965 and 1995 (early cohort) and between 1996 and 2010 (contemporary cohort), compared with propensity score-matched, RRT-treated, non-Alport ESKD controls. A total of 58,422 patients started RRT during this period, of whom 296 (0.5%) had Alport ESKD. In the early cohort, Alport ESKD was associated with superior dialysis patient survival [adjusted hazard ratio (HR): 0.41, 95% confidence interval (CI): 0.20-0.83, P = 0.01], renal allograft survival (HR: 0.74, 95% CI: 0.54-1.01, P = 0.05) and renal transplant patient survival (HR: 0.43, 95% CI: 0.28-0.66, P < 0.001) compared with controls. In the contemporary cohort, no differences were observed between the two groups for dialysis patient survival (HR: 1.42, 95% CI: 0.65-3.11, P = 0.38), renal allograft survival (HR: 1.01, 95% CI: 0.57-1.79, P = 0.98) or renal transplant patient survival (HR: 0.67, 95% CI: 0.26-1.73, P = 0.41). One Alport patient (0.4%) had post-transplant anti-glomerular basement membrane (anti-GBM) disease. Four female and 41 male Alport patients became parents while on RRT, with generally good neonatal outcomes. Alport syndrome patients experienced comparable dialysis and renal transplant outcomes to matched non-Alport ESKD controls in the contemporary cohort, owing to relatively greater improvements in outcomes for non-Alport ESKD patients over time. Post-transplant anti-GBM disease was rare.
Publisher: Croatian Society for Medical Biochemistry and Laboratory Medicine
Date: 2007
DOI: 10.11613/BM.2007.003
Publisher: Elsevier BV
Date: 03-2020
DOI: 10.1053/J.AJKD.2019.05.028
Abstract: Clinical practice guidelines for dietary intake in hemodialysis focus on individual nutrients. Little is known about associations of dietary patterns with survival. We evaluated the associations of dietary patterns with cardiovascular and all-cause mortality among adults treated by hemodialysis. Prospective cohort study. 8,110 of 9,757 consecutive adults on hemodialysis (January 2014 to June 2017) treated in a multinational private dialysis network and with analyzable dietary data. Data-driven dietary patterns based on the GA2LEN food frequency questionnaire. Cardiovascular and all-cause mortality. Principal components analysis with varimax rotation to identify common dietary patterns. Adjusted proportional hazards regression analyses with country as a random effect to estimate the associations between dietary pattern scores and mortality. Associations were expressed as adjusted HRs with 95% CIs, using the lowest quartile score as reference. During a median follow-up of 2.7 years (18,666 person-years), there were 2,087 deaths (958 cardiovascular). 2 dietary patterns, "fruit and vegetable" and "Western," were identified. For the fruit and vegetable dietary pattern score, adjusted HRs, in ascending quartiles, were 0.94 (95% CI, 0.76-1.15), 0.83 (95% CI, 0.66-1.06), and 0.91 (95% CI, 0.69-1.21) for cardiovascular mortality and 0.95 (95% CI, 0.83-1.09), 0.84 (95% CI, 0.71-0.99), and 0.87 (95% CI, 0.72-1.05) for all-cause mortality. For the Western dietary pattern score, the corresponding estimates were 1.10 (95% CI, 0.90-1.35), 1.11 (95% CI, 0.87-1.41), and 1.09 (95% CI, 0.80-1.49) for cardiovascular mortality and 1.01 (95% CI, 0.88-1.16), 1.00 (95% CI, 0.85-1.18), and 1.14 (95% CI, 0.93-1.41) for all-cause mortality. Self-reported food frequency questionnaire; data-driven approach. 
These findings did not confirm an association between mortality among patients receiving long-term hemodialysis and the extent to which dietary patterns were either high in fruit and vegetables or consistent with a Western diet.
Publisher: SAGE Publications
Date: 11-1998
DOI: 10.1177/089686089801800603
Abstract: In view of previous studies demonstrating hyperleptinemia in uremic and hemodialysis patients, the aims of the present study were to determine whether serum leptin levels are elevated in peritoneal dialysis (PD) patients, to establish whether leptin is significantly removed by PD, and to elucidate the relationship of plasma leptin to body composition, dietary intake, nutritional indices, and dialysis adequacy. Cross-sectional analysis of PD patients and matched healthy controls. Tertiary-care institutional dialysis center. The study included 49 PD patients [35 women and 14 men; median age 63 years, interquartile range (IQR) 49.5-68.5 years; body mass index (BMI) 25.5 ± 0.8] and 27 controls (11 men and 16 women; median age 42 years, IQR 34.8-51; BMI 27.2 ± 0.9). For evaluation of leptin clearance, 8 patients receiving nocturnal intermittent PD were also evaluated. The primary outcome measure was plasma leptin concentration. Dialysate leptin concentration was also measured in 7 patients. Serum leptin levels were significantly higher (p < 0.01) in patients (males: median 11 ng/mL, IQR 9-19 ng/mL; females: median 53 ng/mL, IQR 19.5-128 ng/mL) compared with controls (males: 5.5 ng/mL, IQR 4-9.5 ng/mL; females: 12 ng/mL, IQR 9.8-17.3 ng/mL). Leptin levels in both groups correlated positively with BMI (r = 0.64 and 0.60, respectively; p < 0.0001) and with percentage body fat determined by dual-energy x-ray absorptiometry (r = 0.86 and 0.82, respectively; p < 0.01). Dialysis patients exhibited a greater increase in serum leptin for any given increase in BMI. No significant correlation was observed between leptin concentration and residual renal function, dialysis adequacy (Kt/V), dietary protein or caloric intake, or serum levels of albumin, prealbumin, C-reactive protein, glucose, and insulin-like growth factor-1. 
Although leptin was detectable in peritoneal dialysate after a 6-hour dwell (median 4.2 ng/mL, IQR 1.1-8.5 ng/mL, n = 8), serum leptin levels were not appreciably lowered following intermittent PD via an automated cycler (63.9 ± 19.3 ng/mL vs 57.6 ± 20.5 ng/mL; p = NS; n = 8). Serum leptin levels are elevated in PD patients and are not appreciably cleared by PD. Although hyperleptinemia correlates poorly with dialysis adequacy and protein intake, a strong and significant relationship was maintained between serum leptin and fat mass. Serum leptin could therefore serve as a useful clinical marker of body fat content in PD patients.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 2006
Publisher: Elsevier BV
Date: 02-2018
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 2009
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 2008
DOI: 10.2215/CJN.02440607
Publisher: Elsevier BV
Date: 02-2018
Publisher: Elsevier BV
Date: 02-2018
Publisher: Public Library of Science (PLoS)
Date: 19-07-2021
DOI: 10.1371/JOURNAL.PONE.0254931
Abstract: Many studies have explored patients’ experiences of dialysis and other treatments for kidney failure. This is the first qualitative multi-site international study of how staff perceive the process of a patient’s transition from peritoneal dialysis to in-centre haemodialysis. Current literature suggests that transitions are poorly coordinated and may result in increased patient morbidity and mortality. This study aimed to understand staff perspectives of transition and to identify areas where clinical practice could be improved. Sixty-one participants (24 UK and 37 Australia), representing a cross-section of kidney care staff, took part in seven focus groups and sixteen interviews. Data were analysed inductively and findings were synthesised across the two countries. For staff, good clinical practice included: effective communication with patients, well planned care pathways and continuity of care. However, staff felt that how they communicated with patients about the treatment journey could be improved. Staff worried they inadvertently made patients fear haemodialysis when trying to explain to them why going onto peritoneal dialysis first is a good option. Despite staff efforts to make transitions smooth, good continuity of care between modalities was only reported in some of the Australian hospitals where, unlike the UK, patients kept the same consultant. Timely access to an appropriate service, such as a psychologist or social worker, was not always available when staff felt it would be beneficial for the patient. Staff were aware of a disparity in access to kidney care and other healthcare professional services between some patient groups, especially those living in remote areas. This was often put down to the lack of funding and capacity within each hospital. This research found that continuity of care between modalities was valued by staff but did not always happen. 
It also highlighted a number of areas for consideration when developing ways to improve care and provide appropriate support to patients as they transition from peritoneal dialysis to in-centre haemodialysis.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 15-07-2004
Publisher: Wiley
Date: 05-09-2019
Publisher: SAGE Publications
Date: 07-2017
Abstract: Glucose is the primary osmotic medium used in most peritoneal dialysis (PD) solutions, and exposure to glucose has been shown to exert detrimental effects both locally, at the peritoneal membrane, and systemically. Moreover, high dialysate glucose exposure may predispose patients to an increased risk of peritonitis, perhaps as a result of impaired host defences, vascular disease, and damage to the peritoneal membrane. In this post-hoc analysis of a multicenter, multinational, open-label randomized controlled trial of neutral-pH, low-glucose degradation product (GDP) versus conventional PD solutions (balANZ trial), the relationship between peritonitis rates and low (< 123.1 g/day) versus high (≥ 123.1 g/day) dialysate glucose exposure was evaluated in 177 incident PD patients over a 2-year study period. Peritonitis rates were 0.44 episodes per patient-year in the low-glucose exposure group and 0.31 episodes per patient-year in the high-glucose exposure group (incidence rate ratio [IRR] 0.69, p = 0.09). There was no significant association between dialysate glucose exposure and peritonitis-free survival on univariable analysis (high glucose exposure hazard ratio [HR] 0.66, 95% confidence interval [CI] 0.40-1.08) or on multivariable analysis (adjusted HR 0.64, 95% CI 0.39-1.05). Moreover, there was no relationship between peritoneal glucose exposure and the type of organism causing peritonitis. Physician-rated severity of first peritonitis episodes was similar between groups, as were the rate and duration of hospital admission. Overall, this study did not identify an association between peritoneal dialysate glucose exposure and peritonitis occurrence, severity, hospitalization, or outcomes. A further large-scale, prospective, randomized controlled trial evaluating patient-level outcomes is merited.
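The incidence rate ratio quoted in the abstract above can be reproduced from the two reported peritonitis rates. A minimal sketch (the small discrepancy with the published IRR of 0.69 presumably reflects rounding of the underlying episode counts):

```python
def incidence_rate_ratio(rate_exposed: float, rate_reference: float) -> float:
    """IRR comparing an exposed group's event rate with a reference group's."""
    return rate_exposed / rate_reference

# Rates reported in the abstract (episodes per patient-year):
# high-glucose exposure 0.31, low-glucose exposure 0.44.
irr = incidence_rate_ratio(0.31, 0.44)
print(round(irr, 2))  # about 0.70, close to the published IRR of 0.69
```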
Publisher: Wiley
Date: 04-2005
DOI: 10.1111/J.1440-1797.2005.00374.X
Abstract: Aboriginal patients maintained on peritoneal dialysis (PD) have a higher rate of technique failure than any other racial group in Australia. Peritonitis accounts for the bulk of these technique failures, but it is uncertain whether the increased risk of peritonitis in Aboriginal patients is independent of associated comorbid conditions, such as diabetes mellitus. Using data collected by the Australia and New Zealand Dialysis and Transplant Registry (ANZDATA), peritonitis rates and time to first peritonitis were compared between Aboriginal (n = 238) and non-Aboriginal patients (n = 2924) commencing PD in Australia between 1 April 1999 and 31 March 2003. Aboriginal PD patients were younger, and had a higher incidence of diabetes, than their non-Aboriginal counterparts. Mean peritonitis rates were significantly higher among Aboriginal (1.15 episodes/year; 95% confidence interval (CI): 1.03-1.28) than non-Aboriginal patients (0.60 episodes/year; 95% CI: 0.57-0.62; P < 0.05). Using multivariate negative binomial regression, independent predictors of higher peritonitis rates included Aboriginal racial origin (adjusted odds ratio 1.78; 95% CI: 1.45-2.19), obesity, age and absence of a recorded dialysate:plasma creatinine ratio (D/P creatinine) measurement. Aboriginal racial origin was also associated with a shorter median time to first peritonitis (9.9 vs 19.3 months, P < 0.05), which remained statistically significant in a multivariate Cox proportional hazards model (adjusted hazard ratio 1.76; 95% CI: 1.47-2.11; P < 0.05). Aboriginal and obese PD patients have a higher rate of peritonitis and a shorter time to first peritonitis, independent of demographic and comorbid factors. Further investigation of the causes of increased peritonitis risk in Aboriginal patients is needed.
Publisher: Elsevier BV
Date: 2003
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 10-2004
Publisher: The American Association of Immunologists
Date: 15-09-2016
Abstract: The antimicrobial responsiveness and function of unconventional human T cells are poorly understood, with only limited access to relevant specimens from sites of infection. Peritonitis is a common and serious complication in individuals with end-stage kidney disease receiving peritoneal dialysis. By analyzing local and systemic immune responses in peritoneal dialysis patients presenting with acute bacterial peritonitis and monitoring individuals before and during defined infectious episodes, our data show that Vγ9/Vδ2+ γδ T cells and mucosal-associated invariant T cells accumulate at the site of infection with organisms producing (E)-4-hydroxy-3-methyl-but-2-enyl pyrophosphate and vitamin B2, respectively. Such unconventional human T cells are major producers of IFN-γ and TNF-α in response to these ligands, which are shared by many microbial pathogens, and affect the cells lining the peritoneal cavity by triggering local inflammation and inducing tissue remodeling, with consequences for peritoneal membrane integrity. Our data uncover a crucial role for Vγ9/Vδ2 T cells and mucosal-associated invariant T cells in bacterial infection and suggest that they represent a useful predictive marker for important clinical outcomes, which may inform future stratification and patient management. These findings are likely to be applicable to other acute infections where local activation of unconventional T cells contributes to the antimicrobial inflammatory response.
Publisher: Wiley
Date: 28-06-2008
Publisher: Hindawi Limited
Date: 2012
DOI: 10.1155/2012/673631
Abstract: Objective. This paper assessed the effectiveness of pre-, pro-, and synbiotics on reducing two protein-bound uremic toxins, p-cresyl sulphate (PCS) and indoxyl sulphate (IS). Methods. English-language studies reporting serum, urinary, or fecal PCS and/or IS (or their precursors) following pre-, pro-, or synbiotic interventions ( day) in human adults were included. Population estimates of differences in the outcomes between the pre- and the post-intervention were estimated for subgroups of studies using four meta-analyses. Quality was determined using the GRADE approach. Results. 19 studies met the inclusion criteria, 14 in healthy adults and five in haemodialysis patients. Eight studies investigated prebiotics, six probiotics, one synbiotics, one both pre- and probiotics, and three studies trialled all three interventions. The quality of the studies ranged from moderate to very low. 12 studies were included in the meta-analyses, with all four meta-analyses reporting statistically significant reductions in IS and PCS with pre- and probiotic therapy. Conclusion. There is limited but supportive evidence for the effectiveness of pre- and probiotics in reducing PCS and IS in the chronic kidney disease population. Further studies are needed to provide more definitive findings before routine clinical use can be recommended.
Publisher: Elsevier BV
Date: 03-2009
DOI: 10.1016/J.MAD.2008.10.003
Abstract: Chronic kidney disease (CKD) in ageing is a burden on health systems worldwide. Rat models of age-related CKD linked with obesity and hypertension were used to investigate alterations in oxidant handling and energy metabolism, to identify gene targets or markers for age-related CKD. Young adult (3 months) and old (21-24 months) spontaneously hypertensive (SHR), normotensive Wistar-Kyoto (WKY) and Wistar rats (normotensive, obese in ageing) were compared for renal functional and physiological parameters, renal fibrosis and inflammation, oxidative stress (heme oxygenase-1/HO-1), apoptosis and cell injury (including Bax:Bcl-2), phosphorylated and non-phosphorylated forms of oxidant- and energy-sensing proteins (p66Shc, AMPK), signal transduction proteins (ERK1/2, PKB), and transcription factors (NF-kappaB, FoxO1). All old rats were normoglycemic. Renal fibrosis, tubular epithelial apoptosis, interstitial macrophages and myofibroblasts (all p<0.05), phospho-p66Shc/p66Shc (p<0.05), the Bax:Bcl-2 ratio (p<0.05) and NF-kappaB expression (p<0.01) were highest in old obese Wistars. Expression of phospho-FoxO/FoxO was elevated in old Wistars (p<0.001) and WKYs (p<0.01); SHRs had high levels in both young and old animals. Expression of PKB, phospho-PKB, ERK1/2 and phospho-ERK1/2 was significantly elevated in all aged animals. These results suggest that obesity and hypertension engage differing oxidant-handling and signalling pathways in the pathogenesis of age-related CKD.
Publisher: Wiley
Date: 05-2009
Publisher: Elsevier BV
Date: 07-2020
Publisher: Wiley
Date: 03-2010
DOI: 10.1111/J.1440-1797.2009.01268.X
Abstract: The incidence of hepatitis B virus (HBV) infection in dialysis populations has declined over recent decades, largely because of improvements in infection control and widespread implementation of HBV vaccination. Regardless, outbreaks of infection continue to occur in dialysis units, and prevalence rates remain unacceptably high. For a variety of reasons, dialysis patients are at increased risk of acquiring HBV. They also demonstrate different disease manifestations compared with healthy individuals and are more likely to progress to chronic carriage. This paper will review the epidemiology, modes of transmission and diagnosis of HBV in this population. Prevention and treatment will be discussed, with a specific focus on strategies to improve vaccination response, new therapeutic options and selection of patients for therapy.
Publisher: Elsevier BV
Date: 2004
DOI: 10.1053/J.AJKD.2003.09.012
Abstract: Cardiac mortality is the main cause of death in patients with chronic kidney disease (CKD). In this study, we assessed the efficacy of long-term intensive lipid level lowering on atherosclerotic burden in patients with CKD. Patients with CKD (n = 38; age 64 ± 11 years) and a similar group of patients with coronary artery disease (CAD; n = 31) were treated prospectively with atorvastatin, up to 80 mg/d. Lipid profile, carotid intima-media thickness (IMT; a marker of atherosclerotic burden), and dobutamine echocardiography were measured at baseline and 2 years. Predictors of change in maximal IMT were sought in a linear model. Despite similar cholesterol level lowering, patients with CAD showed an improvement in maximum IMT, whereas those with CKD did not (mean between-group difference, 0.07 mm; 95% confidence interval, 0.01 to 0.12). Change in maximal IMT was associated with kidney disease (R² = 0.09; P = 0.013), smoking (R² = 0.083; P = 0.017), baseline low-density lipoprotein cholesterol (LDL-C) level (R² = 0.064; P = 0.045), very-low-density lipoprotein cholesterol (VLDL-C) level (R² = 0.084; P = 0.021), and calcium channel blocker use (R² = 0.094; P = 0.01). In a multivariate model, kidney disease and baseline LDL-C and VLDL-C levels remained independent predictors of change in maximal IMT (model R² = 0.24; P = 0.004). Only patients with CAD decreased their number of ischemic segments (2.5 ± 1.4 to 1.2 ± 1.5 segments; P = 0.002). Overall, change in ischemic segment number correlated with change in maximal IMT (r = 0.32; P = 0.019). Patients with CKD undergoing intensive lipid level lowering do not show the same changes in atherosclerotic or ischemic burden as patients with CAD. Independent predictors of change in maximal IMT were CKD and baseline LDL-C and VLDL-C levels.
Publisher: Springer Science and Business Media LLC
Date: 03-05-2021
DOI: 10.1038/S41598-021-88786-4
Abstract: To examine whether skin autofluorescence (sAF) differed in early adulthood between individuals with type 1 diabetes and age-matched controls, and to ascertain whether sAF aligned with risk for kidney disease. Young adults with type 1 diabetes (N = 100; age 20.0 ± 2.8 years; M:F 54:46; FBG 11.6 ± 4.9 mmol/L; diabetes duration 10.7 ± 5.2 years; BMI 24.5 (5.3) kg/m²) and healthy controls (N = 299; age 20.3 ± 1.8 years; M:F 83:116; FBG 5.2 ± 0.8 mmol/L; BMI 22.5 (3.3) kg/m²) were recruited. Skin autofluorescence and circulating advanced glycation end-products (AGEs) were measured. In a subset of both groups, kidney function was estimated by eGFR (CKD-EPI cystatin C) and urinary albumin:creatinine ratio (uACR), and diabetic kidney disease (DKD) risk was defined by uACR tertiles. Youth with type 1 diabetes had higher sAF and BMI, and were taller, than controls. For sAF, 13.6% of variance was explained by diabetes duration, height and BMI (Pmodel = 1.5 × 10⁻¹²). In the subset examining kidney function, eGFR and sAF were higher in type 1 diabetes versus controls. eGFR and sAF predicted 24.5% of variance in DKD risk (Pmodel = 2.2 × 10⁻⁹), which increased with diabetes duration (51%; Pmodel < 2.2 × 10⁻¹⁶) and random blood glucose concentrations (56%; Pmodel < 2.2 × 10⁻¹⁶). HbA1c and circulating fructosamine albumin were higher in individuals with type 1 diabetes at high versus low DKD risk. eGFR was independently associated with DKD risk in all models. Higher eGFR and longer diabetes duration are associated with DKD risk in youth with type 1 diabetes. sAF, circulating AGEs, and urinary AGEs were not independent predictors of DKD risk. Changes in eGFR should be monitored early, in addition to uACR, for determining DKD risk in type 1 diabetes.
Publisher: Elsevier BV
Date: 05-2014
DOI: 10.1016/J.ARCMED.2014.04.002
Abstract: Indoxyl sulfate (IS) and p-cresyl sulfate (PCS) are nephro- and cardiovascular toxins, produced solely by the gut microbiota, which have pro-inflammatory and pro-oxidative properties in vitro. We undertook this study to investigate the associations between IS and PCS and both inflammation and oxidative stress in the chronic kidney disease (CKD) population. In this cross-sectional observational cohort study, participants with stage 3-4 CKD who enrolled in a randomized controlled trial of cardiovascular risk modification underwent baseline measurements of serum total and free IS and PCS (measured by ultra-performance liquid chromatography), inflammatory markers (interferon gamma [IFN-γ], interleukin-6 [IL-6] and tumor necrosis factor-alpha [TNF-α]), antioxidant and oxidative stress markers (plasma glutathione peroxidase [GPx] activity, total antioxidant capacity [TAC] and F2-isoprostanes) and pulse wave velocity (PWV), a marker of arterial stiffness. There were 149 CKD patients (59% male; age 60 ± 10 years; 44% diabetic) with a mean eGFR of 40 ± 9 mL/min/1.73 m² (range, 25-59). Serum free and total IS were independently associated with serum IL-6, TNF-α and IFN-γ, whereas serum free and total PCS were independently associated with serum IL-6 and PWV. Free IS and PCS were additionally independently associated with serum GPx but not with TAC or F2-isoprostanes. IS and PCS were associated with elevated levels of selected inflammatory markers and an antioxidant in CKD patients. PCS was also associated with increased arterial stiffness. Inflammation and oxidative stress may contribute to the nephro- and cardiovascular toxicities of IS and PCS. Intervention studies targeting production of IS and PCS by dietary manipulation, and the subsequent effect on cardiovascular-related outcomes, are warranted in the CKD population.
Publisher: Wiley
Date: 02-2000
Publisher: Oxford University Press (OUP)
Date: 10-2002
Abstract: Central venous catheters are frequently needed for the provision of haemodialysis, but their clinical usefulness is severely limited by infectious complications. The risk of such infections can be reduced by topical application of mupirocin to the exit sites of non-cuffed catheters or by the use of tunnelled, cuffed catheters. Whether mupirocin offers any additional protection against infection in patients with tunnelled, cuffed haemodialysis catheters has not been studied. An open-label, randomized controlled trial was performed comparing the effect of thrice-weekly exit site application of mupirocin (mupirocin group) vs no ointment (control group) on infection rates and catheter survival in patients receiving haemodialysis via a newly inserted, tunnelled, cuffed central venous catheter. All patients were followed until catheter removal and were monitored for the development of exit site infections and catheter-associated bacteraemias. Fifty patients were enrolled in the study. Both the mupirocin (n=27) and control (n=23) groups were similar at baseline with respect to demographic characteristics, comorbid illnesses and causes of renal failure. Compared with controls, mupirocin-treated patients experienced significantly fewer catheter-related bacteraemias (7% vs 35%, P<0.01) and a longer time to first bacteraemia (log rank score 8.68, P<0.01). The beneficial effect of mupirocin was entirely attributable to a reduction in staphylococcal infection (log rank 10.69, P=0.001) and was still observed when only patients without prior nasal Staphylococcus aureus carriage were included in the analysis (log rank score 6.33, P=0.01). Median catheter survival was also significantly longer in the mupirocin group (108 vs 31 days, log rank score 5.9, P<0.05). Mupirocin use was not associated with any adverse patient effects or the induction of antimicrobial resistance. 
Thrice-weekly application of mupirocin to tunnelled, cuffed haemodialysis catheter exit sites is associated with a marked reduction in line-related sepsis and a prolongation of catheter survival.
Publisher: Springer Science and Business Media LLC
Date: 10-03-2017
DOI: 10.1007/S10157-016-1255-Y
Abstract: It is well-established that uremic toxins are positively correlated with the risk of developing chronic kidney disease and cardiovascular disease. In addition, emerging data suggest that gut bacteria exert an influence over both the production of uremic toxins and the development of chronic kidney disease. As such, modifying the gut microbiota may have potential as a treatment for chronic kidney disease. This is supported by data suggesting that rescuing microbiota dysbiosis may: reduce uremic toxin production; prevent toxins and pathogens from crossing the intestinal barrier; and reduce gastrointestinal tract transit time, allowing nutrients to reach the microbiota in the distal portion of the gastrointestinal tract. Despite emerging literature, the gut-kidney axis has yet to be fully explored. A special focus should be placed on examining clinically translatable strategies that might encourage improvements to the microbiome, thereby potentially reducing the risk of the development of chronic kidney disease. This review aims to present an overview of literature linking changes to the gastrointestinal tract with microbiota dysbiosis and the development and progression of chronic kidney disease.
Publisher: SAGE Publications
Date: 03-2019
Abstract: Peritoneal dialysis (PD) is a home-based therapy where nurses train patients in its use. There has been no published randomized controlled trial (RCT) evaluating any specific protocol for nurses delivering PD training. A standardized education package based upon the best available evidence and utilizing modern educational practices may lead to improved patient outcomes. The aim is to develop a standardized, evidence-based curriculum for PD trainers and patients aligned with guidelines from the International Society for Peritoneal Dialysis (ISPD), using best practice pedagogy. A literature search and clinical audit were conducted to identify current practice patterns and best practice. Results were reviewed by a focus group of practitioners comprising PD nurses, nephrologists, consumers, a medical education expert, and an eLearning expert. From this, a training curriculum and modules were developed. A comprehensive PD training curriculum has been developed, which includes modules for training PD nurses (trainers) and patient training manuals. The package comprises 2 introductory modules and 2 clinical case modules. The curriculum is designed for both interactive digital media (trainers) and traditional paper-based teaching with practical demonstrations (patients). Assessment is also addressed. The need for the development of a comprehensive and standardized curriculum for PD nurse trainers and their patients was confirmed. This paper outlines the process of the development of this curriculum. Pilot testing of the modules was launched in late 2017 to examine feasibility, and planning has commenced for an RCT in 2019 to investigate the effect of the modules on clinical outcomes, and their wider application across Australia and New Zealand.
Publisher: Elsevier BV
Date: 02-2001
DOI: 10.1016/S0002-9343(00)00695-1
Abstract: Atherosclerotic vascular disease is the main cause of morbidity and mortality in patients with end-stage renal disease, but the independent contribution of renal failure rather than associated risk factors is unclear. We sought to examine the relative contribution of these factors to the severity of atherosclerosis by measuring intima-medial thickness and brachial artery reactivity in uremic patients and controls. Cardiovascular risk factors, including lipid and homocysteine levels, were evaluated in 213 patients (69 on hemodialysis, 60 on peritoneal dialysis, and 82 nonuremic controls). High-resolution B-mode ultrasonography with automated off-line analysis was used to measure the intima-medial thickness in the common carotid artery and to measure the lumen diameter of the brachial artery at rest, during reactive hyperemia, and after sublingual nitroglycerine. The correlations of risk factors with intima-medial thickness and brachial reactivity were examined using a general linear regression model. Patients with renal failure had a greater mean (± SEM) maximum intima-medial thickness than controls (0.83 ± 0.02 mm versus 0.70 ± 0.02 mm, P < 0.05). The uremic state was an independent predictor of intima-medial thickness (r² = 0.16, P < 0.001) but not of brachial artery reactivity (P = 0.99). The atherosclerotic burden in patients with renal failure, as indicated by an increased intima-medial thickness, may reflect effects of uremia that are independent of cardiovascular risk factors.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 06-2010
DOI: 10.2215/CJN.09081209
Publisher: Springer Science and Business Media LLC
Date: 06-10-2021
Publisher: Oxford University Press (OUP)
Date: 23-09-2006
DOI: 10.1093/NDT/GFL543
Abstract: The activation of nuclear factor-kappaB (NF-kappaB) has been implicated in the development, progression and metastasis of renal cell carcinoma (RCC). This study investigates the effect of pyrrolidine dithiocarbamate (PDTC), a NF-kappaB inhibitor, on two metastatic human RCC cell lines, ACHN and SN12K1. RCC cell lines and normal cells were exposed to 25 or 50 microM of PDTC. Apoptosis was measured by flow cytometry and TdT-mediated nick end labelling methods. Cell viability and proliferation were measured by MTT and BrdU assays, respectively. Expression of NF-kappaB subunits, IkappaBs, the IkappaB kinase (IKK) complex and apoptotic regulatory proteins was analysed by western blotting and/or immunofluorescence. DNA-binding activity of the NF-kappaB subunits was measured by ELISA. RCC cell lines had a higher basal level expression of all five subunits of NF-kappaB than normal primary cultures of human proximal tubular epithelial cells or HK-2 cells. PDTC decreased the viability and proliferation of RCC, but not normal, cells. Of the two RCC cell lines, ACHN had a higher basal level expression of all five NF-kappaB subunits than SN12K1 and was more resistant to PDTC. While PDTC induced an overall decrease in expression of all five NF-kappaB subunits in both RCC cell lines, unexpectedly, it increased the nuclear expression of NF-kappaB in ACHN, but not in SN12K1. PDTC reduced the DNA-binding activity of all the NF-kappaB subunits and the expression of the IKK complex (IKK-alpha, IKK-beta and IKK-gamma) and the inhibitory units IkappaB-alpha and IkappaB-beta. PDTC induced a significant increase in apoptosis in both RCC cell lines. This was associated with a decrease in expression of the anti-apoptotic proteins Bcl-2 and Bcl-XL, without marked changes in the pro-apoptotic protein Bax. These data suggest that PDTC has the potential to be an anticancer agent in some forms of RCC.
Publisher: Elsevier BV
Date: 2019
DOI: 10.1016/J.KINT.2018.08.036
Abstract: Reliable estimates of the long-term outcomes of acute kidney injury (AKI) are needed to inform clinical practice and guide allocation of health care resources. This systematic review and meta-analysis aimed to quantify the association between AKI and chronic kidney disease (CKD), end-stage kidney disease (ESKD), and death. Systematic searches were performed through EMBASE, MEDLINE, and grey literature sources to identify cohort studies in hospitalized adults that used standardized definitions for AKI, included a non-exposed comparator, and followed patients for at least 1 year. Risk of bias was assessed by the Newcastle-Ottawa Scale. Random-effects meta-analyses were performed to pool risk estimates; subgroup, sensitivity, and meta-regression analyses were used to investigate heterogeneity. Of 4973 citations, 82 studies (comprising 2,017,437 participants) were eligible for inclusion. Common sources of bias included incomplete reporting of outcome data, missing biochemical values, and inadequate adjustment for confounders. Individuals with AKI were at increased risk of new or progressive CKD (HR 2.67, 95% CI 1.99-3.58; 17.76 versus 7.59 cases per 100 person-years), ESKD (HR 4.81, 95% CI 3.04-7.62; 0.47 versus 0.08 cases per 100 person-years), and death (HR 1.80, 95% CI 1.61-2.02; 13.19 versus 7.26 deaths per 100 person-years). A gradient of risk across increasing AKI stages was demonstrated for all outcomes. For mortality, the magnitude of risk was also modified by clinical setting, baseline kidney function, diabetes, and coronary heart disease. These findings establish the poor long-term outcomes of AKI while highlighting the importance of injury severity and clinical setting in the estimation of risk.
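The pooling step described in this abstract (a random-effects meta-analysis of study-level hazard ratios) can be sketched as follows. This is a generic DerSimonian-Laird illustration with hypothetical inputs, not the authors' analysis code, and `pool_random_effects` is an assumed name.

```python
import math

def pool_random_effects(hrs, cis, z=1.96):
    """Pool hazard ratios with a DerSimonian-Laird random-effects model.
    hrs: per-study hazard ratios; cis: per-study (lower, upper) 95% CIs."""
    y = [math.log(h) for h in hrs]                        # log-HR per study
    # Standard error recovered from the 95% CI width on the log scale
    se = [(math.log(hi) - math.log(lo)) / (2 * z) for lo, hi in cis]
    w = [1 / s**2 for s in se]                            # inverse-variance weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)  # fixed-effect mean
    q = sum(wi * (yi - ybar)**2 for wi, yi in zip(w, y))  # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)               # between-study variance
    w_re = [1 / (s**2 + tau2) for s in se]                # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - z * se_pooled),
            math.exp(pooled + z * se_pooled))

# Hypothetical two-study example: pooled HR lands between the study estimates
hr, lo, hi = pool_random_effects([2.0, 3.0], [(1.5, 2.67), (2.0, 4.5)])
```

The pooled estimate is an inverse-variance weighted average on the log scale, with the weights inflated by the estimated between-study variance tau-squared.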
Publisher: SAGE Publications
Date: 17-12-2020
Abstract: While numerous studies have explored the patient experience of dialysis or other end-stage kidney disease (ESKD) treatments, few have explored the process of transitioning between dialysis modalities. This study aimed to develop an in-depth understanding of patient and caregiver perceptions and experiences of the transition from peritoneal to haemodialysis (HD) and to identify ways in which transitions can be optimised. Fifty-four in-depth, semi-structured interviews were undertaken at six study sites across the West Midlands, UK (n = 23), and Queensland, Australia (n = 31). Thirty-nine participants were patients with ESKD; the remainder were family members. An inductive analytical approach was employed, with findings synthesised across sites to identify themes that transcended country differences. Of the 39 patient transitions, only 4 patients reported a wholly negative transition experience. Three cross-cutting themes identified common transition experiences and areas perceived to make a difference to the treatment transition: resistance to change and fear of HD; transition experience shared with family; and bodily adjustment and sense of self. Although each transition is unique to the individual and their circumstances, kidney care services could optimise the process by recognising these patient-led themes and developing strategies that engage with them. Kidney care services should consider ways to keep patients aware of potential future treatment options and present them objectively. There is potential value in integrating expert support before and during treatment transitions to identify and address patient and family concerns.
Publisher: S. Karger AG
Date: 2021
DOI: 10.1159/000511848
Abstract: Background: Kidney disease is a major global public health problem, and laboratory testing of kidney health measures is essential for diagnosis and monitoring. The availability and affordability of kidney health laboratory tests across countries has not been systematically described. Methods: The International Society of Nephrology (ISN), in partnership with leaders of a Kidney Disease: Improving Global Outcomes (KDIGO) Controversies Conference, surveyed a representative subset of ISN-Global Kidney Health Atlas (ISN-GKHA) respondents from April to June 2020. We assessed the association between country gross national income (GNI) per capita and laboratory testing availability and affordability. Results: Of 33 regional expert nephrologists invited, 24 (73%) responded, representing all 10 ISN regions around the world. Availability of kidney health laboratory tests was as follows: serum Cr (100%), serum cystatin C (67%), urine albumin (96%), urine Cr (100%), and dipstick urinalysis (100%). Median (IQR) reimbursement values in international dollars were as follows: serum Cr Int$ 6.61 (3.42–8.84), serum cystatin C Int$ 31.51 (17.36–46.25), urine albumin Int$ 10.22 (5.90–15.42), urine Cr Int$ 7.50 (1.66–8.84), and dipstick urinalysis Int$ 6.26 (2.56–8.40). Reimbursement values did not differ significantly by World Bank income group or by GNI per capita. Conclusion: There was widespread availability of kidney health laboratory tests and substantial variation in reimbursement values. To achieve meaningful progress across nations in mitigating the growth of kidney disease, access to affordable diagnostic technology is essential. Our results are highly relevant to policymakers and researchers as countries increasingly consider national strategies for kidney disease detection and management.
Publisher: MDPI AG
Date: 14-07-2021
DOI: 10.3390/IJMS22147532
Abstract: Coagulopathies common to patients with diabetes and chronic kidney disease (CKD) are not fully understood. Fibrin deposits in the kidney suggest the local presence of clotting factors including tissue factor (TF). In this study, we investigated the effect of glucose availability on the synthesis of TF by cultured human kidney tubular epithelial cells (HTECs) in response to activation of protease-activated receptor 2 (PAR2). PAR2 activation by peptide 2f-LIGRLO-NH2 (2F, 2 µM) enhanced the synthesis and secretion of active TF (~45 kDa), which was blocked by a PAR2 antagonist (I-191). Treatment with 2F also significantly increased the consumption of glucose from the cell medium and lactate secretion. Culturing HTECs in 25 mM glucose enhanced TF synthesis and secretion over 5 mM glucose, while addition of 5 mM 2-deoxyglucose (2DOG) significantly decreased TF synthesis and reduced its molecular weight (~40 kDa). Blocking glycosylation with tunicamycin also reduced 2F-induced TF synthesis while reducing its molecular weight (~36 kDa). In conclusion, PAR2-induced TF synthesis in HTECs is enhanced by culture in high concentrations of glucose and suppressed by inhibiting either PAR2 activation (I-191), glycolysis (2DOG) or glycosylation (tunicamycin). These results may help explain how elevated concentrations of glucose promote clotting abnormalities in diabetic kidney disease. The application of PAR2 antagonists to treat CKD should be investigated further.
Publisher: Elsevier BV
Date: 03-2021
Publisher: SAGE Publications
Date: 05-2017
Abstract: Prevention of exit-site infection (ESI) is of paramount importance to peritoneal dialysis (PD) patients. The aim of this study was to evaluate the effectiveness of chlorhexidine in the prevention of ESI in incident PD patients compared with mupirocin. This retrospective, pre-test/post-test observational study included all incident PD patients at Singapore General Hospital from 2012 to 2015. Patients received daily topical exit-site application of either mupirocin (2012–2013) or chlorhexidine (2014–2015) in addition to routine exit-site cleaning with 10% povidone-iodine. The primary outcome was ESI rate during the 2 time periods. Secondary outcomes were peritonitis rate, times to first ESI and peritonitis, hospitalization rate, and infection-related catheter removal. Event rates were analyzed using Poisson regression, and infection-free survival was estimated using Kaplan-Meier and Cox regression survival analyses. The study included 162 patients in the mupirocin period (follow-up 141.5 patient-years) and 175 patients in the chlorhexidine period (follow-up 136.9 patient-years). Compared with mupirocin-treated patients, chlorhexidine-treated patients experienced more frequent ESIs (0.22 vs 0.12 episodes/patient-year, p = 0.048), although this was no longer statistically significant following multivariable analysis (incidence rate ratio [IRR] 1.78, 95% confidence interval [CI] 0.98–3.26, p = 0.06). No significant differences were observed between the 2 groups with respect to time to first ESI (p = 0.10), peritonitis rate (p = 0.95), time to first peritonitis (p = 0.60), hospitalization rate (p = 0.21) or catheter removal rate (0.03 vs 0.04 episodes/patient-year, p = 0.56). Topical exit-site application of chlorhexidine cream was associated with a borderline significant, higher rate of ESI in incident PD patients compared with mupirocin cream.
Publisher: Oxford University Press (OUP)
Date: 07-1999
Publisher: Wiley
Date: 27-05-2013
DOI: 10.1111/NEP.12095
Publisher: Elsevier BV
Date: 06-2019
Publisher: SAGE Publications
Date: 11-2011
Abstract: The number of elderly patients with end-stage kidney disease (ESKD) is increasing worldwide, but the proportion of elderly patients commencing peritoneal dialysis (PD) is falling. The reluctance of elderly ESKD patients to consider PD may be related to a perception that PD is associated with greater rates of complications. In the present study, we compared outcomes between younger and older PD patients. Using Australia and New Zealand Dialysis Registry data, all adult ESKD patients commencing PD between 1991 and 2007 were categorized into under 50, 50–64.9, and 65 years of age or older groups. Time to first peritonitis, death-censored technique failure, and peritonitis-associated and all-cause mortality were evaluated by multivariate Cox proportional hazards model analysis. Of the 12,932 PD patients included in the study, 3370 (26%) were under 50 years of age, 4386 (34%) were 50–64.9 years of age, and 5176 (40%) were 65 years of age or older. Compared with younger patients (<50 years), elderly patients (≥65 years) had similar peritonitis-free survival and a lower risk of death-censored technique failure [hazard ratio (HR) 0.85; 95% confidence interval (CI) 0.79 to 0.93], but they had higher peritonitis-related (HR 2.31; 95% CI 1.68 to 3.18) and all-cause mortality (HR 2.90; 95% CI 2.60 to 3.23). Not unexpectedly, elderly patients have higher peritonitis-related and all-cause mortality, which is likely a consequence of a greater prevalence of comorbid disease. However, compared with younger patients, elderly patients have superior technique survival and similar peritonitis-free survival, suggesting that PD is a viable renal replacement therapy in this group of patients.
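The time-to-event outcomes in registry analyses like this one (time to first peritonitis, technique failure, death) are typically summarised with Kaplan-Meier curves before Cox modelling. A minimal sketch of the Kaplan-Meier product-limit estimator, using made-up follow-up data rather than anything from the registry, is:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: follow-up times; events: 1 = event observed (e.g. first
    peritonitis), 0 = censored. Returns [(time, survival)] at event times."""
    data = sorted(zip(times, events))
    n = len(data)
    curve, s = [], 1.0
    at_risk = n
    i = 0
    while i < n:
        t = data[i][0]
        d = 0                     # events at this distinct time
        j = i
        while j < n and data[j][0] == t:
            d += data[j][1]
            j += 1
        if d:                     # survival drops only at event times
            s *= (at_risk - d) / at_risk
            curve.append((t, s))
        at_risk -= (j - i)        # events and censorings leave the risk set
        i = j
    return curve

# Four hypothetical patients: events at t=1 and t=3, censored at t=2 and t=4
curve = kaplan_meier([1, 2, 3, 4], [1, 0, 1, 0])  # [(1, 0.75), (3, 0.375)]
```

Censored subjects remain in the risk set up to and including their censoring time, which is why the curve only steps down at observed event times.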
Publisher: Wiley
Date: 10-01-2011
Publisher: John Wiley & Sons, Ltd
Date: 15-04-2009
Publisher: Wiley
Date: 11-09-2013
Publisher: John Wiley & Sons, Ltd
Date: 08-07-2004
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 06-2004
DOI: 10.1097/00007691-200406000-00011
Abstract: The current approach for therapeutic drug monitoring in renal transplant recipients receiving mycophenolate mofetil (MMF) is measurement of total mycophenolic acid (MPA) concentration. Because MPA is highly bound, during hypoalbuminemia the total concentration no longer reflects the free (pharmacologically active) concentration. The authors investigated what degree of hypoalbuminemia causes a significant change in protein binding and thus percentage free MPA. Forty-two renal transplant recipients were recruited for the study. Free and total concentrations of MPA (predose, and 1, 3, and 6 hours post-MMF dose samples) and plasma albumin concentrations were determined on day 5 posttransplantation. Six-hour area under the concentration-time curve (AUC(0-6)) values were calculated for free and total MPA, and percentage free MPA was determined for each patient. The authors found a significant relationship between low albumin concentrations and increased percentage free MPA (Spearman correlation = -0.54). The albumin concentration below which percentage free MPA was significantly increased (≥ 3%) in this patient population was 31 g/L. At this cutoff value albumin was found to be a good predictor of altered free MPA percentage, with a sensitivity and specificity of 0.75 and 0.80, respectively, and an area under the ROC curve of 0.79. To rationalize MMF dosing regimens in hypoalbuminemic patients (plasma albumin ≤ 31 g/L), clinicians should consider monitoring the free MPA concentration.
Publisher: Frontiers Media SA
Date: 29-01-2021
DOI: 10.3389/FPHAR.2020.627185
Abstract: Chinese herbal medicine (CHM) might have benefits in patients with non-diabetic chronic kidney disease (CKD), but there is a lack of high-quality evidence, especially in CKD4. This study aimed to assess the efficacy and safety of Bupi Yishen Formula (BYF) vs. losartan in patients with non-diabetic CKD4. This trial was a multicenter, double-blind, double-dummy, randomized controlled trial that was carried out from 11-08-2011 to 07-20-2015. Patients were assigned (1:1) to receive either BYF or losartan for 48 weeks. The primary outcome was the change in the slope of the estimated glomerular filtration rate (eGFR) over 48 weeks. The secondary outcomes were the composite of end-stage kidney disease, death, doubling of serum creatinine, stroke, and cardiovascular events. A total of 567 patients were randomized to BYF (n = 283) or losartan (n = 284); of these, 549 (97%) patients were included in the final analysis. The BYF group had a slower renal function decline, particularly prior to 12 weeks, over the 48-week duration (between-group mean difference of eGFR slopes: −2.25 ml/min/1.73 m2/year, 95% confidence interval [CI]: −4.03 to −0.47), and a lower risk of the composite outcome of death from any cause, doubling of serum creatinine level, end-stage kidney disease (ESKD), stroke, or cardiovascular events (adjusted hazard ratio = 0.61, 95% CI: 0.44 to 0.85). No significant between-group differences were observed in the incidence of adverse events. We conclude that BYF might have renoprotective effects among non-diabetic patients with CKD4 in the first 12 weeks and over 48 weeks, but longer follow-up is required to evaluate the long-term effects. Clinical Trial Registration: www.chictr.org.cn , identifier ChiCTR-TRC-10001518.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 04-2013
Publisher: Elsevier BV
Date: 03-2018
DOI: 10.1053/J.AJKD.2017.09.018
Abstract: Many randomized controlled trials have been performed with the goal of improving outcomes related to hemodialysis vascular access. If the reported outcomes are relevant and measured consistently to allow comparison of interventions across trials, such trials can inform decision making. This study aimed to assess the scope and consistency of vascular access outcomes reported in contemporary hemodialysis trials. Systematic review. Adults requiring maintenance hemodialysis. All randomized controlled trials and trial protocols reporting vascular access outcomes identified from ClinicalTrials.gov, Embase, MEDLINE, and the Cochrane Kidney and Transplant Specialized Register from January 2011 to June 2016. Any hemodialysis-related intervention. The frequency and characteristics of vascular access outcome measures were analyzed and classified. From 168 relevant trials, 1,426 access-related outcome measures were extracted and classified into 23 different outcomes. The 3 most common outcomes were function (136 [81%] trials), infection (63 [38%]), and maturation (31 [18%]). Function was measured in 489 different ways, but most frequently reported as "mean access blood flow (mL/min)" (37 [27%] trials) and "number of thromboses" (30 [22%]). Infection was assessed in 136 different ways, with "number of access-related infections" being the most common measure. Maturation was assessed in 44 different ways at 15 different time points and most commonly characterized by vein diameter and blood flow. Patient-reported outcomes, including pain (19 [11%]) and quality of life (5 [3%]), were reported infrequently. Only a minority of trials used previously standardized outcome definitions. Restricted sampling frame for feasibility and focus on contemporary trials. The reporting of access outcomes in hemodialysis trials is very heterogeneous, with limited patient-reported outcomes and infrequent use of standardized outcome measures.
Efforts to standardize outcome reporting for vascular access are critical to optimizing the comparability, reliability, and value of trial evidence to improve outcomes for patients requiring hemodialysis.
Publisher: Springer Science and Business Media LLC
Date: 08-11-2018
DOI: 10.1007/S00394-017-1568-Y
Abstract: The gut-liver interaction suggests that modification of gut bacterial flora using probiotics and synbiotics may improve liver function. This systematic review and meta-analysis aimed to clarify the effect of probiotics and synbiotics consumption on the serum concentration of liver function enzymes. PubMed (MEDLINE), Cumulative Index to Nursing and Allied Health Literature, and Cochrane Library (Central) were searched from 1980 to August 2017 for studies where adults consumed probiotics and/or synbiotics in controlled trials and changes in liver function enzymes were examined. A total of 17 studies (19 trials) were included in the meta-analysis. Random effects meta-analyses were applied. Probiotics and synbiotics significantly reduced serum alanine aminotransferase [-8.05 IU/L, 95% confidence interval (CI) -13.07 to -3.04; p = 0.002], aspartate aminotransferase (-7.79 IU/L, 95% CI -13.93 to -1.65; p = 0.02) and gamma-glutamyl transpeptidase (-8.40 IU/L, 95% CI -12.61 to -4.20; p < 0.001). Changes in the serum concentration of alkaline phosphatase and albumin did not reach a statistically significant level. Changes to bilirubin levels were in favour of the control group (0.95 μmol/L, 95% CI 0.48-1.42; p < 0.001). Subgroup analysis suggested that the existence of liver disease at baseline, synbiotics supplementation, and duration of supplementation ≥ 8 weeks resulted in more pronounced improvement in liver function enzymes than their counterparts. Probiotics and synbiotics may be suggested as supplements to improve serum concentration of liver enzymes, especially when synbiotics are administered for a period ≥ 8 weeks and in individuals with liver disease.
Publisher: No publisher found
Date: 2011
Publisher: Oxford University Press (OUP)
Date: 08-2009
Publisher: Elsevier BV
Date: 04-2015
DOI: 10.1053/J.AJKD.2014.09.012
Abstract: Managing the complex fluid and diet requirements of chronic kidney disease (CKD) is challenging for patients. We aimed to summarize patients' perspectives of dietary and fluid management in CKD to inform clinical practice and research. Systematic review of qualitative studies. Adults with CKD who express opinions about dietary and fluid management. MEDLINE, EMBASE, PsycINFO, CINAHL, Google Scholar, reference lists, and PhD dissertations were searched to May 2013. Thematic synthesis. We included 46 studies involving 816 patients living in middle- to high-income countries. Studies involved patients treated with facility-based and home hemodialysis (33 studies, 462 patients), peritoneal dialysis (10 studies, 112 patients), either hemodialysis or peritoneal dialysis (3 studies, 73 patients), kidney transplant recipients (9 studies, 89 patients), and patients with non-dialysis-dependent CKD stages 1 to 5 (5 studies, 80 patients). Five major themes were identified: preserving relationships (interference with roles, social limitations, and being a burden), navigating change (feeling deprived, disrupting held truths, breaking habits and norms, being overwhelmed by information, questioning efficacy, and negotiating priorities), fighting temptation (resisting impositions, experiencing mental invasion, and withstanding physiologic needs), optimizing health (accepting responsibility, valuing self-management, preventing disease progression, and preparing for and protecting a transplant), and becoming empowered (comprehending paradoxes, finding solutions, and mastering change and demands). Limited data in non-English languages and low-income settings and for adults with CKD not treated with hemodialysis. Dietary and fluid restrictions are disorienting and an intense burden for patients with CKD.
Patient-prioritized education strategies, harnessing patients' motivation to stay well for a transplant or to avoid dialysis, and viewing adaptation to restrictions as a collaborative journey are suggested strategies to help patients adjust to dietary regimens in order to reduce their impact on quality of life.
Publisher: Wiley
Date: 12-2001
Publisher: Oxford University Press (OUP)
Date: 23-06-2015
DOI: 10.1093/NDT/GFU222
Abstract: Conscientious integration of the best available evidence in the care of an individual patient can be challenging for a busy clinician. A well-conducted systematic review can adequately inform not only clinicians, but also policy makers and researchers, about the benefits and risks of a particular intervention. In this article, we describe how to critically appraise the methods and interpret the results of a systematic review of interventional trials and apply the findings of a systematic review to clinical questions.
Publisher: Oxford University Press (OUP)
Date: 15-09-2014
DOI: 10.1093/NDT/GFT378
Abstract: Non-randomized studies suggest an association between serum uric acid levels and progression of chronic kidney disease (CKD). The aim of this systematic review is to summarize evidence from randomized controlled trials (RCTs) concerning the benefits and risks of uric acid-lowering therapy on renal outcomes. Medline, Excerpta Medical Database and Cochrane Central Register of Controlled Trials were searched with English language restriction for RCTs comparing the effect of uric acid-lowering therapy with placebo/no treatment on renal outcomes. Treatment effects were summarized using random-effects meta-analysis. Eight trials (476 participants) evaluating allopurinol treatment were eligible for inclusion. There was substantial heterogeneity in baseline kidney function, cause of CKD and duration of follow-up across these studies. In five trials, there was no significant difference in change in glomerular filtration rate from baseline between the allopurinol and control arms [mean difference (MD) 3.1 mL/min/1.73 m2, 95% confidence interval (CI) -0.9 to 7.1; heterogeneity χ2 = 1.9, I2 = 0%, P = 0.75]. In three trials, allopurinol treatment abrogated increases in serum creatinine from baseline (MD -0.4 mg/dL, 95% CI -0.8 to -0.0 mg/dL; heterogeneity χ2 = 3, I2 = 34%, P = 0.22). Allopurinol had no effect on proteinuria and blood pressure. Data for effects of allopurinol therapy on progression to end-stage kidney disease and death were scant. Allopurinol had uncertain effects on the risks of adverse events. Uric acid-lowering therapy with allopurinol may retard the progression of CKD. However, adequately powered randomized trials are required to evaluate the benefits and risks of uric acid-lowering therapy in CKD.
Publisher: Elsevier BV
Date: 07-2007
DOI: 10.1016/J.TRSL.2007.01.006
Abstract: Progressive renal fibrosis is an unwanted and limiting side effect of cancer treatments, whether they are systemic (for example, chemotherapy), local (for example, radiotherapy), or total body irradiation for allogenic bone marrow transplants. The relative roles of macrophages, myofibroblasts, and lymphocytes and the apoptotic deletion of renal functional or inflammatory cell populations in the pathogenesis of renal fibrosis are yet unclear. In this study, rat models of 2 renal cancer treatments, cis-platinum-(II)-diammine dichloride (cisplatin, 6-mg/kg body weight) and radiation (single dose of 20 Gy), were used. Kidneys were analyzed 4 days to 3 months after treatment. The extent of renal fibrosis was compared with number and localization of chronic inflammatory cell populations, cell death (apoptosis and necrosis), and expression and localization of the profibrotic growth factors transforming growth factor-beta1 (TGF-beta1) and tumor necrosis factor-alpha (TNF-alpha). The models provided contrasting rates of fibrogenesis: after cisplatin, development of fibrosis was rapid and extensive (up to 50% fibrosis at 3 months); in comparison, radiation-induced fibrosis was slowly progressive (approximately 10% fibrosis at 3 months). The extent of fibrosis was associated spatially and temporally with increasing numbers of myofibroblasts with TGF-beta1 or macrophages with TNF-alpha. Tubular epithelial apoptosis was highest with high TNF-alpha (P<0.05). A significant inverse correlation existed between extent of tubulointerstitial fibrosis and interstitial cell apoptosis for cisplatin, and a similar nonsignificant result for radiation (r2 = 0.8671 for cisplatin, P<0.05; r2 = 0.2935 for radiation, NS). The latter result suggests a role for inflammatory cell apoptosis in minimizing development of renal fibrosis.
Publisher: Oxford University Press (OUP)
Date: 07-03-2011
DOI: 10.1093/NDT/GFR070
Abstract: The number of indigenous patients with end-stage kidney disease (ESKD) is increasing in Australia, reflecting a similar trend in other countries. Because many indigenous patients live in remote areas, peritoneal dialysis (PD) is often preferred. Compared to non-indigenous PD patients, indigenous patients have increased complication rates, but the effect of residential location on outcomes remains unclear. The aim of this study is to examine the association between race and PD outcomes stratified by location. Using the Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry, all adult ESKD patients commencing PD in Australia between 1995 and 2008 were included. Patients were stratified as non-indigenous or indigenous race and were grouped according to their residential location, the latter stratified into metropolitan, regional and remote areas. Outcomes evaluated included peritonitis, technique failure, peritonitis-related and all-cause mortality. Regional and/or remote PD patients generally have a greater risk of peritonitis-related complications and/or mortality compared to metropolitan patients. However, remote indigenous PD patients had the greatest risk of all PD-related complications, including all-cause and peritonitis-related mortality. This registry analysis demonstrates that non-metropolitan PD patients, especially remote indigenous patients, have higher complication rates, suggesting that environmental factors are important in determining PD outcomes.
Publisher: Wiley
Date: 26-04-2014
Publisher: Springer Science and Business Media LLC
Date: 21-11-2019
DOI: 10.1186/S12882-019-1605-6
Abstract: Membranoproliferative glomerulonephritis (MPGN) is an uncommon cause of end stage kidney disease (ESKD) and the clinical outcomes of patients with MPGN who commence kidney replacement therapy have not been comprehensively studied. All adult patients with ESKD due to glomerulonephritis commencing kidney replacement therapy in Australia and New Zealand from January 1, 1996 to December 31, 2016 were reviewed. Patients with ESKD due to MPGN were compared to patients with other forms of glomerulonephritis. Patient survival on dialysis and following kidney transplantation, kidney recovery on dialysis, time to transplantation, allograft survival, death-censored allograft survival and disease recurrence post-transplant were compared between the two groups using Kaplan-Meier survival curves and Cox proportional hazards regression. Of 56,481 patients included, 456 (0.8%) had MPGN and 12,660 (22.4%) had another form of glomerulonephritis. Five-year patient survival on dialysis and following kidney transplantation were similar between patients with ESKD from MPGN and other forms of glomerulonephritis (dialysis: 59% vs. 62%, p = 0.61; transplant: 93% vs. 93%, p = 0.49). Compared to patients with other forms of glomerulonephritis, patients with MPGN had significantly poorer 5-year allograft survival (70% vs. 81%, respectively; p = 0.02) and death-censored allograft survival (74% vs. 87%, respectively; p < 0.01). The risk of disease recurrence was significantly higher in patients with MPGN compared to patients with other glomerulonephritidites (18% vs. 5%; p < 0.01). Among patients with MPGN who had allograft loss, those with MPGN recurrence had a significantly shorter time to allograft loss compared to those whose allograft loss was due to any other cause (median time to allograft loss 3.2 years vs. 4.4 years, p < 0.01).
Compared with other forms of glomerulonephritis, patients with MPGN experienced comparable rates of survival on dialysis and following kidney transplantation, but significantly higher rates of allograft loss due to disease recurrence.
Publisher: John Wiley & Sons, Ltd
Date: 23-01-2008
Publisher: Bentham Science Publishers Ltd.
Date: 02-2010
Publisher: Elsevier BV
Date: 09-2020
Publisher: Springer Science and Business Media LLC
Date: 06-1992
DOI: 10.2165/00003088-199222060-00003
Abstract: Renal insufficiency is characterised by impaired host defences, which are compromised further by each of the 3 modes of renal replacement--haemodialysis, continuous ambulatory peritoneal dialysis (CAPD) and renal transplantation. Reduced renal clearance of unknown toxins, possible development of nutritional deficiencies and administration of immunosuppressive medications lead to aberrant immune regulation early in the course of renal failure. This results subsequently in increased frequency and severity of infection. Vaccination plays an important role in attenuating this infection risk, but impaired cell-mediated and humoral immunity contraindicates the use of live vaccines and engenders suboptimal and short-lived antibody responses to inactivated vaccines. Reinforced vaccination schedules, increased vaccine dosage and concomitantly administered adjuvant immunomodulators have variably improved the defective antibody responses to certain vaccines. Immunisation against hepatitis B virus has resulted in a significant decrease in prevalence and incidence of this infection in haemodialysis units. Similarly, the inoculation of influenza vaccine in patients with uraemia and of polyvalent pneumococcal vaccine in special risk circumstances has been recommended because of perceived reductions in morbidity and mortality from infection with these agents. Cytomegalovirus (CMV) vaccine may attenuate CMV disease severity in recipients of renal allografts. Staphylococcus aureus vaccine, on the other hand, is ineffective in preventing peritonitis or exit site infections in patients receiving CAPD. Other killed vaccines have not been comprehensively studied, but generally have the same indications for use as in normal individuals. However, the protection that these vaccines afford may be either inadequate or transient, so that other infection control strategies should be simultaneously implemented.
Publisher: SAGE Publications
Date: 09-2018
Abstract: There is a growing, global burden of patients with end-stage kidney disease (ESKD) requiring renal replacement therapy. Although peritoneal dialysis (PD) is considered to be the most cost-effective dialysis modality, its utilization has been declining in some regions. The first year after starting PD is thought to be a vulnerable period for technique failure, which in turn contributes to poor patient retention. Improved understanding of the risk factors for technique failure during this period may help the development of targeted strategies to lower its incidence and improve both the utilization and utility of PD. This up-to-date review will summarize current evidence regarding the definition, incidence, causes, and predictors of early PD technique failure. Promising avenues for directing future research efforts will also be discussed.
Publisher: Informa UK Limited
Date: 28-09-2016
DOI: 10.1080/17446651.2016.1239527
Abstract: Bone disorders in chronic kidney disease (CKD) are associated with heightened risks of fractures, vascular calcification, poor quality of life and mortality compared to the general population. However, diagnosis and management of these disorders in CKD are complex and appreciably limited by current diagnostic modalities. Areas covered: Bone histomorphometry remains the gold standard for diagnosis but is not widely utilised and lacks feasibility as a monitoring tool. In practice, non-invasive imaging and biochemical markers are preferred to guide therapeutic decisions. Expert commentary: This review aims to summarize the risk factors for, and spectrum of bone disease in CKD, as well as appraise the clinical utility of dual energy X-ray densitometry, peripheral quantitative computed tomography, high-resolution peripheral quantitative computed tomography, and bone turnover markers.
Publisher: Wiley
Date: 16-12-2017
Publisher: Wiley
Date: 02-2005
DOI: 10.1111/J.1440-1797.2005.00361.X
Abstract: Icodextrin is a starch-derived glucose polymer that causes sustained ultrafiltration in long dwells in peritoneal dialysis. The aim of this study was to assess factors that were predictive of an increment in ultrafiltration following the introduction of icodextrin in patients with refractory fluid overload. Thirty-nine patients (20 male/19 female, mean age 57.7 +/- 2.4 years) on peritoneal dialysis were enrolled in a prospective pretest/post-test, open-label study. All patients had symptomatic fluid overload refractory to fluid restriction (<800 mL/day), frusemide doses of 250 mg or more daily, optimization of dwell time and use of hypertonic dextrose. An icodextrin exchange was substituted for a 4.25% dextrose exchange for the long-dwell period. After 1 month, the median 24 h ultrafiltration volume increased by 500 mL (interquartile range: 50-1000). An increase in ultrafiltration volume correlated positively with the dialysate : plasma creatinine ratio at 4 h (r = 0.498, P = 0.001) and negatively with the ratio of dialysate glucose concentrations at 4 and 0 h (r = -0.464, P = 0.003). On multivariate regression analysis, high transporter status was predictive of a greater ultrafiltration response to icodextrin relative to dextrose peritoneal dialysis exchanges. Age, sex, race, peritoneal dialysis duration, peritoneal dialysis modality, diabetes mellitus, baseline albumin, and baseline ultrafiltration volume were not significantly correlated with the change in ultrafiltration volume. Icodextrin significantly augments ultrafiltration volumes in patients with refractory fluid overload. A high peritoneal membrane transporter status is the best predictor of a favourable ultrafiltration response to icodextrin.
Publisher: American Chemical Society (ACS)
Date: 07-04-2011
DOI: 10.1021/ES104384M
Abstract: Results from a systematic investigation of mercury (Hg) concentrations across 14 forest sites in the United States show highest concentrations in litter layers, strongly enriched in Hg compared to aboveground tissues and indicative of substantial postdepositional sorption of Hg. Soil Hg concentrations were lower than in litter, with highest concentrations in surface soils. Aboveground tissues showed no detectable spatial patterns, likely due to 17 different tree species present across sites. Litter and soil Hg concentrations positively correlated with carbon (C), latitude, precipitation, and clay (in soil), which together explained up to 94% of concentration variability. We observed strong latitudinal increases in Hg in soils and litter, in contrast to inverse latitudinal gradients of atmospheric deposition measures. Soil and litter Hg concentrations were closely linked to C contents, consistent with well-known associations between organic matter and Hg, and we propose that C also shapes distribution of Hg in forests at continental scales. The consistent link between C and Hg distribution may reflect a long-term legacy whereby old, C-rich soil and litter layers sequester atmospheric Hg depositions over long time periods. Based on a multiregression model, we present a distribution map of Hg concentrations in surface soils of the United States.
Publisher: S. Karger AG
Date: 2015
DOI: 10.1159/000440815
Abstract: Background: Fibrillary glomerulonephritis (FGN) and immunotactoid glomerulopathy (IG) are uncommon and characterised by non-amyloid fibrillary glomerular deposits. The aim of this study was to investigate characteristics and outcomes of patients undergoing renal replacement therapy (RRT) for end-stage kidney disease (ESKD) secondary to FGN and IG. Methods: All ESKD patients who commenced RRT in Australia and New Zealand from 1 January 1990 to 31 December 2010 were included. Outcomes were assessed by Kaplan-Meier, multivariate logistic-regression analysis and multivariable Cox proportional-hazards survival analysis. Results: Of 45,216 individuals with ESKD, 55 (0.12%) had FGN and 11 (0.02%) had IG. The median survival of FGN patients on dialysis (5.63 years, 95% CI 3.31-7.96) was not significantly different from patients with other ESKD causes (median 4.01 years, 95% CI 4.34-4.47; log-rank 1.32, p = 0.25), but was significantly longer than that of IG patients (median 2.93 years, 95% CI 0.00-6.17; log-rank 4.8, p = 0.03). Thirteen (24%) FGN patients received 13 renal allografts, 4 (36%) IG patients received 4 renal allografts and 11,528 (26%) other ESKD patients received 12,278 renal allografts. FGN patients experienced comparable outcomes to other ESKD patients for both 10-year patient survival (100 vs. 84%, p = 0.93) and renal-allograft survival (67 vs. 76%, p = 0.06). For IG, the median follow-up was 3.66 years with 75% patient survival and 100% renal-allograft survival. One (8%) FGN patient and 1 (25%) IG patient experienced recurrent FGN and IG respectively in their allograft. Conclusion: Patients with FGN have comparable dialysis and renal transplant outcomes to patients with other causes of ESKD. IG patients have inferior survival on dialysis, although renal transplant outcomes are acceptable. Disease recurrence in renal allografts was low for both FGN and IG.
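The median-survival figures in the abstract above come from Kaplan-Meier (product-limit) curves. As a rough illustration of how such a curve is built, here is a minimal sketch on toy data; the function and dataset are illustrative only, not the registry analysis:

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    times: follow-up times; events: 1 = event observed, 0 = censored."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        tied = [e for tt, e in data if tt == t]  # all observations at time t
        deaths = sum(tied)
        if deaths:
            s *= (at_risk - deaths) / at_risk    # step down at each event time
            curve.append((t, round(s, 3)))
        at_risk -= len(tied)
        i += len(tied)
    return curve

# Toy cohort: follow-up in years; the third patient is censored at year 3
print(kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1]))  # → [(1, 0.75), (2, 0.5), (4, 0.0)]
```

The median survival reported in studies like this is the first time at which the estimated curve drops to 0.5 or below.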
Publisher: SAGE Publications
Date: 03-2016
Abstract: The longitudinal trends of lipid parameters and the impact of biocompatible peritoneal dialysis (PD) solutions on these levels remain to be fully defined. The present study aimed to a) evaluate the influence of neutral pH, low glucose degradation product (GDP) PD solutions on serum lipid parameters, and b) explore the capacity of lipid parameters (total cholesterol [TC], triglyceride [TG], high density lipoprotein [HDL], TC/HDL, low density lipoprotein [LDL], very low density lipoprotein [VLDL]) to predict cardiovascular events (CVE) and mortality in PD patients. The study included 175 incident participants from the balANZ trial with at least 1 stored serum sample. A composite CVE score was used as a primary clinical outcome measure. Multilevel linear regression and Poisson regression models were fitted to describe the trend of lipid parameters over time and its ability to predict composite CVE, respectively. Small but statistically significant increases in serum TG (coefficient 0.006, p < 0.001), TC/HDL (coefficient 0.004, p = 0.001), and VLDL cholesterol (coefficient 0.005, p = 0.001) levels and a decrease in the serum HDL cholesterol levels (coefficient -0.004, p = 0.009) were observed with longer time on PD, whilst the type of PD solution (biocompatible vs standard) received had no significant effect on these levels. Peritoneal dialysis glucose exposure was significantly associated with trends in TG, TC/HDL, HDL and VLDL levels. Baseline lipid parameter levels were not predictive of composite CVEs or all-cause mortality. Serum TG, TC/HDL, and VLDL levels increased and the serum HDL levels decreased with increasing PD duration. None of the lipid parameters were significantly modified by biocompatible PD solution use over the time period studied or predictive of composite CVE or mortality.
Publisher: Informa UK Limited
Date: 04-2012
DOI: 10.4161/AUTO.19496
Publisher: SAGE Publications
Date: 07-2017
Abstract: Significant interest in the practice of urgent-start peritoneal dialysis (PD) is mounting internationally, with several observational studies supporting the safety, efficacy, and feasibility of this approach. However, little is known about the early complication rates and long-term technique and peritonitis-free survival for patients who start PD urgently (i.e. within 2 weeks of catheter insertion), compared to those with a conventional start. This single-center, matched case-control study evaluated patients commencing PD between 2010 and 2015. Urgent-start PD patients were matched 1:3 with conventional-start PD controls based on diabetic status and age. The primary outcomes were early complications, both following catheter insertion and PD commencement (within 4 weeks). Secondary outcomes included technique and peritonitis-free survival. A total of 104 patients (26 urgent-start, 78 conventional-start) were included. Urgent-start patients were more likely to be referred late, initiate PD in hospital, and be prescribed lower initial exchange volumes (p < 0.01). They experienced more frequent leaks post-catheter insertion (12% vs 1%, p = 0.047) and more frequent catheter migration following commencement of PD (12% vs 1%, p = 0.047). There were no significant differences in the rates of overall or infectious complications. Kaplan-Meier estimates of technique survival and time to first episode of peritonitis were comparable between the groups. Compared with conventional-start PD, urgent-start PD has acceptably low early complication rates and similar long-term technique survival. Urgent-start PD appears to be a safe way to initiate urgent renal replacement therapy in patients without established dialysis access.
Publisher: Elsevier BV
Date: 09-2004
Publisher: Wiley
Date: 08-2005
DOI: 10.1111/J.1440-1797.2005.00406.X
Abstract: Obesity is a frequent and important consideration to be taken into account when assessing patient suitability for renal transplantation. In addition, posttransplant obesity continues to represent a significant challenge to health care professionals caring for renal transplant recipients. Despite the vast amount of evidence that exists on the effect of pretransplant obesity on renal transplant outcomes, there are still conflicting views regarding whether obese renal transplant recipients have a worse outcome, in terms of short- and long-term graft survival and patient survival, compared with their non-obese counterparts. It is well established that any association of obesity with reduced patient survival in renal transplant recipients is mediated in part by its clustering with traditional cardiovascular risk factors such as hypertension, dyslipidaemia, insulin resistance and posttransplant diabetes mellitus, but what is not understood is what mediates the association of obesity with graft failure. Whether it is the higher incidence of cardiovascular comorbidities jeopardising the graft or factors specific to obesity, such as hyperfiltration and glomerulopathy, that might be implicated, currently remains unknown. It can be concluded, however, that pre- and posttransplant obesity should be targeted as aggressively as the more well-established cardiovascular risk factors in order to optimize long-term renal transplant outcomes.
Publisher: SAGE Publications
Date: 17-01-2020
Abstract: Icodextrin is a high molecular weight, starch-derived glucose polymer that is used as an osmotic agent in peritoneal dialysis (PD) to promote ultrafiltration. There has been wide variation in its use across Australia and the rest of the world, but it is unclear whether these differences are due to patient- or centre-related factors. Using the Australia and New Zealand Dialysis and Transplant Registry, all adult patients (≥ 18 years) who started PD in Australia between 1 January 2007 and 31 December 2014 were included. The primary outcome was icodextrin use at PD commencement. Hierarchical logistic regression clustered around the treatment centre was applied to determine the patient- and centre-related characteristics associated with icodextrin use. The impact of centre-level practice pattern variability on icodextrin uptake was estimated using the intra-cluster correlation coefficient (ICC). Of 5948 patients starting on PD in 58 centres during the study period, 2002 (33.7%) received icodextrin from the outset. Overall uptake of icodextrin increased from 29% in 2010 to 42.5% in 2014. Patient-level characteristics associated with an increased likelihood of commencing PD with icodextrin included male sex (adjusted odds ratio (OR) 1.55, 95% confidence interval (CI) 1.35–1.77, p < 0.001), prior haemodialysis or kidney transplantation (OR 1.26, 95% CI 1.09–1.47), obesity (OR 1.66, 95% CI 1.41–1.96), diabetes mellitus (OR 2.32, 95% CI 2.03–2.64) and residing in a postcode with the highest decile of socio-economic status (OR 1.43, 95% CI 1.11–1.85). The centre-level characteristic associated with an increased likelihood of commencing PD with icodextrin was routine assessment of a peritoneal equilibration test (OR 1.45, 95% CI 1.27–1.66). Centres with a lower proportion of patients on automated peritoneal dialysis (APD) were less likely to start patients on icodextrin (OR 0.45, 95% CI 0.20–0.99).
Centre factors accounted for 25% of the variation in icodextrin use among incident PD patients (ICC 0.25). Icodextrin use in incident Australian PD patients is increasing and variable, and is associated with both patient and centre characteristics. Centre-related factors explained 25% of the variability in icodextrin use.
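The ICC of 0.25 quoted above summarises how much of the variation sits at the centre level in the hierarchical logistic model. One common convention for deriving an ICC from a random-intercept logistic model is the latent-variable approximation, sketched below; the function name and the between-centre variance fed into it are hypothetical, back-calculated purely to illustrate the formula:

```python
import math

def logistic_icc(centre_variance: float) -> float:
    """Intra-cluster correlation for a random-intercept logistic model,
    using the latent-variable approximation: the level-1 residual
    variance is fixed at pi^2 / 3 (the standard logistic distribution)."""
    residual = math.pi ** 2 / 3
    return centre_variance / (centre_variance + residual)

# A between-centre variance of roughly 1.10 would correspond to the
# ICC of 0.25 reported in the abstract (hypothetical back-calculation)
print(round(logistic_icc(1.0966), 2))  # → 0.25
```

The approximation makes the ICC comparable across models even though a logistic outcome has no natural residual variance on the probability scale.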
Publisher: SAGE Publications
Date: 09-2004
DOI: 10.1177/089686080402400511
Abstract: The aim of this study was to prospectively evaluate the ability of a peritoneal equilibration test (PET) performed in the first week of peritoneal dialysis (PD) to predict subsequent transport status, as determined by a PET at 4 weeks and 1 year after PD commencement. Prospective observational study of an incident PD cohort at a single center. Tertiary-care institutional dialysis center. The study included 50 consecutive patients commencing PD at the Princess Alexandra Hospital between 25/2/2001 and 14/5/2003 (mean age 60.9 ± 12.2 years, 54% male, 92% Caucasian, 38% diabetic). All patients were initially prescribed continuous ambulatory PD. Measurements performed during paired PETs included dialysate-to-plasma ratios of urea (D/P urea) and creatinine (D/P creatinine) at 4 hours, the ratio of dialysate glucose concentrations at 0 and 4 hours (D/D0 glucose), and drain volumes at 4 hours. When paired 1-week and 1-month PET data were analyzed, significant changes were observed in measured D/P urea (0.91 ± 0.07 vs 0.94 ± 0.07 respectively, p < 0.05), D/P creatinine (0.55 ± 0.12 vs 0.66 ± 0.11, p < 0.001), and D/D0 glucose (0.38 ± 0.08 vs 0.36 ± 0.10, p < 0.05). Using Bland–Altman analysis, the repeatability coefficients were 0.17, 0.20, and 0.13, respectively. Agreement between 1-week and 1-month PET measurements with respect to peritoneal transport category was moderate for D/D0 glucose (weighted κ 0.52), but poor for D/P urea (0.30), D/P creatinine (0.35), and drain volumes (0.20). The PET measurements performed more than 1 year following PD commencement (n = 28) generally agreed closely with 1-month measurements, and poorly with 1-week measurements. Peritoneal transport characteristics change significantly within the first month of PD. PETs carried out during this time should be considered preliminary and should be confirmed by a PET 4 weeks later.
Nevertheless, performing an early D/D 0 glucose measurement at 1 week predicted ultimate transport status sufficiently well to facilitate early clinical decision-making about optimal PD modality while patients were still receiving PD training. On the other hand, the widespread practice of using measured drain volumes in the first week to predict ultimate transport category is highly inaccurate and not recommended.
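The repeatability coefficients quoted for the paired PETs come from Bland–Altman analysis. A minimal sketch, assuming the common convention of 1.96 times the standard deviation of the paired differences; the D/P creatinine values below are made up purely for illustration:

```python
from statistics import stdev

def repeatability_coefficient(first, second):
    """Bland-Altman repeatability coefficient: 1.96 x the standard
    deviation of the differences between paired measurements. Smaller
    values mean the two measurements agree more closely."""
    diffs = [a - b for a, b in zip(first, second)]
    return 1.96 * stdev(diffs)

# Hypothetical paired D/P creatinine values at 1 week and 1 month
week_1  = [0.52, 0.61, 0.48, 0.66, 0.55, 0.59]
month_1 = [0.63, 0.70, 0.55, 0.74, 0.68, 0.66]
print(round(repeatability_coefficient(week_1, month_1), 3))  # → 0.047
```

On this convention, 95% of repeat measurements in the same patient are expected to differ by less than the coefficient, which is why the study's values of 0.13 to 0.20 indicate substantial within-patient drift between week 1 and month 1.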
Publisher: Wiley
Date: 23-02-2012
DOI: 10.1111/J.1440-1797.2011.01560.X
Abstract: The aim of this study was to develop a limited sampling strategy (LSS) for the simultaneous estimation of exposure to tacrolimus, mycophenolic acid and unbound prednisolone in adult kidney transplant recipients. Tacrolimus, mycophenolic acid and unbound prednisolone area under the concentration-time curve profiles from 0 to 12 h post dose (AUC(0-12)) were collected from 20 subjects. Multiple linear regression analyses were performed to develop a LSS enabling the simultaneous estimation of exposure to all three drugs. Median percentage prediction error and median absolute percentage prediction error were calculated via jackknife analysis to evaluate bias and imprecision. LSS showed superior ability to predict exposure compared with single concentration-time points. A LSS incorporating concentration measurements at 0.5 h (C(0.5)), 2 h (C(2)) and 4 h (C(4)) post dose displayed acceptable predictive ability for all three drugs. This LSS may serve as a useful research tool for further investigation of the utility of concentration monitoring of these medications.
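The bias and imprecision metrics named above (median percentage prediction error and median absolute percentage prediction error) reduce to a few lines once the LSS-predicted and fully measured AUCs are in hand. This sketch uses invented AUC values purely for illustration, not data from the study:

```python
from statistics import median

def prediction_errors(predicted, observed):
    """Return (median % prediction error, median absolute % prediction
    error): the bias and imprecision metrics used to judge a limited
    sampling strategy against full AUC measurements."""
    pe = [100 * (p - o) / o for p, o in zip(predicted, observed)]
    return median(pe), median(abs(e) for e in pe)

# Hypothetical LSS-predicted vs fully measured AUC(0-12) values
pred = [102, 95, 110, 88]
obs  = [100, 100, 100, 100]
print(prediction_errors(pred, obs))  # → (-1.5, 7.5)
```

A median percentage error near zero indicates little systematic bias; the absolute version quantifies how far individual predictions scatter regardless of direction.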
Publisher: Elsevier BV
Date: 12-2018
Publisher: Wiley
Date: 06-08-2007
Publisher: Oxford University Press (OUP)
Date: 09-11-2006
DOI: 10.1093/NDT/GFI248
Abstract: There is limited information about the outcomes of patients commencing peritoneal dialysis (PD) after failed kidney transplantation. The aim of the present study was to compare patient survival, death-censored technique survival and peritonitis-free survival between patients initiating PD after failed renal allografts and those after failed native kidneys. The study included all patients from the ANZDATA Registry who started PD between April 1, 1991 and March 31, 2004. Times to death, death-censored technique failure and first peritonitis episode were examined by multivariate Cox proportional hazards models. For all outcomes, conditional risk set models were utilized for the multiple failure data, and analyses were stratified by failure order. Standard errors were calculated by using robust variance estimation for the cluster-correlated data. In total, 13,947 episodes of PD were recorded in 23,579 person-years. Of these, 309 PD episodes were started after allograft failure. Compared with PD patients who had never undergone kidney transplantation, those with failed renal allografts were more likely to be younger, Caucasian, New Zealand residents and life-long non-smokers with lower body mass index (BMI), poorer initial renal function and a longer period from commencement of the first renal replacement therapy to PD. On multivariate analysis, PD patients with failed kidney transplants had comparable patient mortality [weighted hazards ratio (HR) 1.09, 95% confidence interval (CI) 0.81-1.45, P = 0.582], death-censored technique failure (adjusted HR 0.91, 95% CI 0.75-1.10, P = 0.315) and peritonitis-free survival (adjusted HR 0.92, 95% CI 0.72-1.16, P = 0.444) with those PD patients who had failed native kidneys. Similar findings were observed in a subset of patients (n = 5496) for whom peritoneal transport status was known and included in the models as a covariate. 
Patients commencing PD after renal allograft failure experienced outcomes comparable with those with failed native kidneys. PD appears to be a viable option for patients with failed kidney allografts.
Publisher: SAGE Publications
Date: 11-2019
Abstract: The optimal treatment for managing anemia in peritoneal dialysis (PD) patients and best clinical practices are not completely understood. We sought to characterize international variations in anemia measures and management among PD patients. The Peritoneal Dialysis Outcomes and Practice Patterns Study (PDOPPS) enrolled adult PD patients from 6 countries from 2014 to 2017. Hemoglobin (Hb), ferritin levels, and transferrin saturation (TSAT), as well as erythropoiesis stimulating agents (ESAs) and iron use were compared cross-sectionally at study enrollment in Australia and New Zealand (A/NZ), Canada, Japan, the United Kingdom (UK), and the United States (US). Among 3,603 PD patients from 193 facilities, mean Hb ranged from 11.0 – 11.3 g/dL across countries. The majority of patients (range 53% – 59%) had Hb 10 – 11.9 g/dL, with 4% – 12% of patients ≥ 13 g/dL and 16% – 23% < 10 g/dL. Use of ESAs was higher in Japan (94% of patients) than elsewhere (66% – 79% of patients). In the US, 63% of patients had a ferritin level > 500 ng/mL, compared with 5% – 38% in other countries. In the US and Japan, 87% – 89% of PD patients had TSAT ≥ 20%, compared with 73% – 76% in other countries. Intravenous (IV) iron use within 4 months of enrollment was higher in the US (55% of patients) than elsewhere (6% – 17% of patients). In this largest international observational study of anemia and anemia management in patients receiving PD, comparable Hb levels across countries were observed but with notable differences in ESA and iron use. Peritoneal dialysis patients in the US have higher ferritin levels and higher IV iron use than patients in other countries.
Publisher: Wiley
Date: 25-05-2007
DOI: 10.1111/J.1440-1797.2007.00810.X
Abstract: Approximately 5-10% of patients with chronic kidney disease demonstrate hyporesponsiveness to erythropoiesis-stimulating agents (ESA), defined as a continued need for greater than 300 IU/kg per week erythropoietin or 1.5 µg/kg per week darbepoetin administered by the subcutaneous route. Such hyporesponsiveness contributes significantly to morbidity, mortality and health-care economic burden in chronic kidney disease and represents an important diagnostic and management challenge. The commonest causes of ESA resistance are non-compliance, absolute or functional iron deficiency and inflammation. It is widely accepted that maintaining adequate iron stores, ideally by administering iron parenterally, is the most important strategy for reducing the requirements for, and enhancing the efficacy of ESA. There have been recent epidemiologic studies linking parenteral iron therapy to an increased risk of infection and atherosclerosis, although other investigations have refuted this. Inflammatory ESA hyporesponsiveness has been reported to be improved by a number of interventions, including the use of biocompatible membranes, ultrapure dialysate, transplant nephrectomy, ascorbic acid therapy, vitamin E supplementation, statins and oxpentifylline administration. Other variably well-established causes of ESA hyporesponsiveness include inadequate dialysis, hyperparathyroidism, nutrient deficiencies (vitamin B12, folate, vitamin C, carnitine), angiotensin-converting enzyme inhibitors, angiotensin receptor blockers, aluminium overload, antibody-mediated pure red cell aplasia, primary bone marrow disorders, myelosuppressive agents, haemoglobinopathies, haemolysis and hypersplenism. This paper reviews the causes of ESA hyporesponsiveness and the clinical evidence for proposed therapeutic interventions. A practical algorithm for approaching the investigation and management of patients with ESA hyporesponsiveness is also provided.
Publisher: Public Library of Science (PLoS)
Date: 21-06-2019
Publisher: Springer Science and Business Media LLC
Date: 30-05-2023
DOI: 10.1186/S13063-023-07363-4
Abstract: An increasing number of older people are living with chronic kidney disease (CKD). Many have complex healthcare needs and are at risk of deteriorating health and functional status, which can adversely affect their quality of life. Comprehensive geriatric assessment (CGA) is an effective intervention to improve survival and independence of older people, but its clinical utility and cost-effectiveness in frail older people living with CKD is unknown. The GOAL Trial is a pragmatic, multi-centre, open-label, superiority, cluster randomised controlled trial developed by consumers, clinicians, and researchers. It has a two-arm design, CGA compared with standard care, with 1:1 allocation of a total of 16 clusters. Within each cluster, study participants ≥ 65 years of age (or ≥ 55 years if Aboriginal or Torres Strait Islander (First Nations Australians)) with CKD stage 3–5/5D who are frail, defined as a Frailty Index (FI) ≥ 0.25, are recruited. Participants in intervention clusters receive a CGA by a geriatrician to identify medical, social, and functional needs, optimise medication prescribing, and arrange multidisciplinary referral if required. Those in standard care clusters receive usual care. The primary outcome is attainment of self-identified goals assessed by standardised Goal Attainment Scaling (GAS) at 3 months. Secondary outcomes include GAS at 6 and 12 months, quality of life (EQ-5D-5L), frailty (Frailty Index – Short Form), transfer to residential aged care facilities, cost-effectiveness, and safety (cause-specific hospitalisations, mortality). A process evaluation will be conducted in parallel with the trial, including whether the intervention was delivered as intended, any issues or local barriers to intervention delivery, and perceptions of the intervention by participants. The trial has 90% power to detect a clinically meaningful mean difference in GAS of 10 units. This trial addresses patient-prioritised outcomes.
It will be conducted, disseminated and implemented by clinicians and researchers in partnership with consumers. If CGA is found to have clinical and cost-effectiveness for frail older people with CKD, the intervention framework could be embedded into routine clinical practice. The implementation of the trial’s findings will be supported by presentations at conferences and forums with clinicians and consumers at specifically convened workshops, to enable rapid adoption into practice and policy for both nephrology and geriatric disciplines. It has potential to materially advance patient-centred care and improve clinical and patient-reported outcomes (including quality of life) for frail older people living with CKD. ClinicalTrials.gov NCT04538157. Registered on 3 September 2020.
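The Frailty Index used for trial eligibility is, in the standard deficit-accumulation formulation, simply the proportion of assessed deficits that are present. The sketch below uses a hypothetical handful of items for illustration; real indices, including the one used for eligibility screening, typically assess 30 or more deficits:

```python
def frailty_index(deficits):
    """Deficit-accumulation frailty index: the mean of deficit scores,
    each coded from 0 (absent) to 1 (fully present)."""
    if not deficits:
        raise ValueError("at least one assessed deficit is required")
    return sum(deficits.values()) / len(deficits)

# Hypothetical 8-item assessment (illustrative only)
patient = {"impaired mobility": 1, "vision loss": 0.5, "hearing loss": 0,
           "polypharmacy": 1, "cognitive decline": 0.5, "recent falls": 0,
           "incontinence": 0, "low mood": 0}
fi = frailty_index(patient)
print(fi, fi >= 0.25)  # → 0.375 True (meets the trial's eligibility cut-off)
```

Because the index is a simple proportion, it stays comparable across instruments with different numbers of items, which is what makes a fixed cut-off such as 0.25 workable as an eligibility criterion.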
Publisher: Oxford University Press (OUP)
Date: 20-11-2021
Abstract: The condition onset flag (COF) variable was introduced into the hospitalization coding practice in 2008 to help distinguish between the new and pre-existing conditions. However, Australian datasets collected prior to 2008 lack the COF, potentially leading to data waste. The aim of this study was to determine if an algorithm to lookback across the previous admissions could make this distinction. All patients requiring kidney replacement therapy (KRT) identified in the Australia and New Zealand Dialysis and Transplant Registry in New South Wales, South Australia and Tasmania between July 2008 and December 2015 were linked with hospital admission datasets using probabilistic linkage. Three different lookback periods entailing either one, two or three admissions prior to the index admission were investigated. Conditions identified in an index admission but not in the lookback periods were classified as a new-onset condition. Conditions identified in both the index admission and the lookback period were deemed to be pre-existing. The degrees of agreement were determined using the kappa statistic. Conditions examined for new onset were myocardial infarction, pulmonary embolism and pneumonia. Conditions examined for prior existence were diabetes mellitus, hypertension and kidney failure. Secondary analyses evaluated whether the conditions identified as pre-existing using COF were captured consistently in the subsequent admissions. 11 140 patients on KRT with 69 403 admissions were analysed. Lookback over a single admission interval (Period 1) provided the highest rates of true positives with COF for all three new-onset conditions, ranging from 89% to 100%. The levels of agreement were almost perfect for all conditions (k = 0.94–1.00). This was consistent across the different time eras. 
All lookback periods identified additional new-onset conditions that were not classified by COF: Lookback Period 1 picked up a further 474 myocardial infarction, 84 pulmonary embolism and 1092 pneumonia episodes. Lookback Period 1 had the highest percentage of true positives when identifying the pre-existing conditions (64–80%). The level of agreement was moderate to strong and was similar across the time eras. Secondary analysis showed that not all pre-existing conditions identified using COF carried forward to the subsequent admission (61–82%), but the proportion increased when looking forward across additional admissions (87–95%). The described algorithm using a lookback period is a pragmatic, reliable and robust means of identifying the new-onset and pre-existing patient conditions, thereby enriching the existing datasets predating the availability of the COF. The findings also highlight the value of concatenating a series of hospital patient admissions to more comprehensively adjudicate the pre-existing conditions, rather than assessing the index admission alone.
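The lookback classification described above can be sketched as follows. The diagnosis codes and the function name are hypothetical, and a real implementation would operate over probabilistically linked admission records rather than in-memory sets:

```python
def classify_condition(code, index_codes, prior_admissions, lookback=1):
    """Label a diagnosis code in the index admission as new-onset or
    pre-existing by scanning the previous `lookback` admissions
    (ordered most recent first). A condition seen in the lookback
    window is deemed pre-existing; otherwise it is new-onset."""
    if code not in index_codes:
        return "absent"
    window = prior_admissions[:lookback]
    seen_before = any(code in admission for admission in window)
    return "pre-existing" if seen_before else "new-onset"

# Hypothetical ICD-style codes for one patient's admission history
index_admission = {"I21", "E11"}              # myocardial infarction, diabetes
prior = [{"E11", "I10"}, {"N18", "E11"}]      # two earlier admissions
print(classify_condition("I21", index_admission, prior))  # → new-onset
print(classify_condition("E11", index_admission, prior))  # → pre-existing
```

Widening `lookback` from 1 to 2 or 3 admissions trades sensitivity for specificity, which is the comparison the three lookback periods in the study were designed to quantify.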
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 27-11-2003
Publisher: Wiley
Date: 10-01-2022
DOI: 10.1111/PETR.14224
Abstract: Data on sex-based disparities in children with kidney failure and outcomes after kidney transplantation are relatively sparse. This study examined the association between sex differences and the odds of receiving a pre-emptive living donor kidney transplantation, and post-transplant outcomes in children and adolescents. We studied all patients (aged <20 years) who commenced kidney replacement therapy (KRT) between 2002 and 2017 using data from the ANZDATA Registry. Factors associated with graft loss and acute rejection after transplantation were assessed using multivariable Cox regression model. Differences in the odds of receiving a pre-emptive live donor transplant between sexes were assessed using adjusted logistic regression. Of the 757 children transplanted during the study period, 497 (65.7%) received a live donor kidney (163, 21.5% pre-emptive). In total, 168 (22.2%) patients experienced graft loss and 213 (28.1%) patients experienced a first episode of acute rejection during the median follow-up period of 6.9 years (IQR 3.5-11.5 years). There were no differences in the rates of graft loss or acute rejection by sex. Compared with boys, the adjusted hazard ratios (aHR) (95% confidence interval) for graft loss and acute rejection in girls were 0.97 (0.71-1.33) and 1.09 (0.82-1.44), respectively. Among children who received living donor kidney transplants, there were no sex differences in the odds of receiving a pre-emptive transplant (adjusted odds ratio (aOR) 0.90 (95% CI 0.56-1.45)). No sex differences were observed in the odds of receiving a pre-emptive living donor kidney transplant or outcomes after kidney transplantation.
Publisher: BMJ
Date: 2012
Publisher: Oxford University Press (OUP)
Date: 29-12-2016
DOI: 10.1093/NDT/GFV413
Abstract: Oral disease is a potentially treatable determinant of mortality and quality of life. No comprehensive multinational study to quantify oral disease burden and to identify candidate preventative strategies has been performed in the dialysis setting. The ORAL disease in hemoDialysis (ORALD) study was a prospective study in adults treated with hemodialysis in Europe (France, Hungary, Italy, Poland, Portugal and Spain) and Argentina. Oral disease was assessed using standardized WHO methods. Participants self-reported oral health practices and symptoms. Sociodemographic and clinical factors associated with oral diseases were determined and assessed within nation states. Of 4726 eligible adults, 4205 (88.9%) participated. Overall, 20.6% were edentulous [95% confidence interval (CI), 19.4-21.8]. Participants had on average 22 (95% CI 21.7-22.2) decayed, missing or filled teeth, while moderate to severe periodontitis affected 40.6% (95% CI 38.9-42.3). Oral disease patterns varied markedly across countries, independent of participant demographics, comorbidity and health practices. Participants in Spain, Poland, Italy and Hungary had the highest mean adjusted odds of edentulousness (2.31, 1.90, 1.90 and 1.54, respectively), while those in Poland, Hungary, Spain and Argentina had the highest odds of ≥14 decayed, missing or filled teeth (23.2, 12.5, 8.14 and 5.23, respectively). Compared with Argentina, adjusted odds ratios for periodontitis were 58.8, 58.3, 27.7, 12.1 and 6.30 for Portugal, Italy, Hungary, France and Poland, respectively. National levels of tobacco consumption, diabetes and child poverty were associated with edentulousness within countries. Oral disease in adults on hemodialysis is very common, frequently severe and highly variable among countries, with much of the variability unexplained by participant characteristics or healthcare. 
Given the national variation and high burden of disease, strategies to improve oral health in hemodialysis patients will require implementation at a country level rather than at the level of individuals.
Publisher: SAGE Publications
Date: 05-2005
Publisher: Wiley
Date: 02-2000
Publisher: Public Library of Science (PLoS)
Date: 24-05-2022
DOI: 10.1371/JOURNAL.PONE.0268823
Abstract: We sought to evaluate the predictors and outcomes of mold peritonitis in patients with peritoneal dialysis (PD). This cohort study included PD patients from the MycoPDICS database who had fungal peritonitis between July 2015-June 2020. Patient outcomes were analyzed by Kaplan-Meier curves and the log-rank test. A multivariable Cox proportional hazards regression model was used to estimate associations between fungal types and patients’ outcomes. The study included 304 fungal peritonitis episodes (yeasts n = 129, hyaline molds n = 122, non-hyaline molds n = 44, and mixed fungi n = 9) in 303 patients. Fungal infections were common during the wet season (p < .001). Mold peritonitis was significantly more frequent in patients with higher hemoglobin levels, presentations with catheter problems, and positive galactomannan (a fungal cell wall component) tests. Patient survival rates were lowest for non-hyaline mold peritonitis. A higher hazard of death was significantly associated with leaving the catheter in-situ (adjusted hazard ratio [HR] = 6.15, 95% confidence interval [CI]: 2.86–13.23) or delaying catheter removal after the diagnosis of fungal peritonitis (HR = 1.56, 95% CI: 1.00–2.44), as well as not receiving antifungal treatment (HR = 2.23, 95% CI: 1.25–4.01) or receiving it for less than 2 weeks (HR = 2.13, 95% CI: 1.33–3.43). Each additional day of antifungal therapy beyond the minimum 14-day duration was associated with a 2% lower risk of death (HR = 0.98, 95% CI: 0.95–0.999). Non-hyaline-mold peritonitis had worse survival. Longer duration and higher daily dosage of antifungal treatment were associated with better survival. Deviations from the 2016 ISPD Peritonitis Guideline recommendations concerning treatment duration and catheter removal timing were independently associated with higher mortality.
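Under the proportional hazards model used in this analysis, the reported per-day hazard ratio compounds multiplicatively, so k extra treatment days correspond to a combined hazard ratio of 0.98^k. A minimal sketch of that arithmetic; the one-extra-week example is illustrative, not a figure from the study:

```python
hr_per_day = 0.98  # reported HR per treatment day beyond the 14-day minimum

def cumulative_hr(extra_days, hr=hr_per_day):
    """Combined hazard ratio for `extra_days` additional days of antifungal
    therapy, assuming the per-day effect multiplies under proportional hazards."""
    return hr ** extra_days

print(round(cumulative_hr(7), 3))  # one extra week of therapy -> 0.868
```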
Publisher: SAGE Publications
Date: 05-2016
Abstract: Extending technique survival on peritoneal dialysis (PD) remains a major challenge in optimizing outcomes for PD patients while increasing PD utilization. The primary objective of the Peritoneal Dialysis Outcomes and Practice Patterns Study (PDOPPS) is to identify modifiable practices associated with improvements in PD technique and patient survival. In collaboration with the International Society for Peritoneal Dialysis (ISPD), PDOPPS seeks to standardize PD-related data definitions and provide a forum for effective international collaborative clinical research in PD. The PDOPPS is an international prospective cohort study in Australia, Canada, Japan, the United Kingdom (UK), and the United States (US). Each country is enrolling a random sample of incident and prevalent patients from national samples of 20 to 80 sites with at least 20 patients on PD. Enrolled patients will be followed over an initial 3-year study period. Demographic, comorbidity, and treatment-related variables, and patient-reported data, will be collected over the study course. The primary outcome will be all-cause PD technique failure or death; other outcomes will include cause-specific technique failure, hospitalizations, and patient-reported outcomes. A high proportion of the targeted number of study sites has been recruited to date in each country. Several ancillary studies have been funded with high momentum toward expansion to new countries and additional participation. The PDOPPS is the first large, international study to follow PD patients longitudinally to capture clinical practice. With data collected, the study will serve as an invaluable resource and research platform for the international PD community, and provide a means to understand variation in PD practices and outcomes, to identify optimal practices, and to ultimately improve outcomes for PD patients.
Publisher: Elsevier BV
Date: 05-2015
DOI: 10.1053/J.AJKD.2014.12.017
Abstract: Biocompatible solutions may lower peritonitis rates, but are more costly than conventional solutions. The aim of the present study was to assess the additional costs and health outcomes of biocompatible over conventional solutions in incident peritoneal dialysis patients to guide practice decisions. Secondary economic evaluation of a randomized controlled trial. 185 participants in the balANZ trial. Cost-effectiveness of biocompatible compared to standard solution over 2 years from an Australian health care funder perspective. Intervention group received biocompatible solutions and control group received standard solutions over 2 years. Costs included dialysis charges, costs of treating peritonitis, non-peritonitis-related hospital stays, and medication. Peritonitis was the health outcome of interest; incremental cost-effectiveness ratios were reported in terms of the additional cost per additional patient avoiding peritonitis at 2 years. Mean total per-patient costs were A$57,451 and A$53,930 for the biocompatible and standard-solution groups, respectively. The base-case analysis indicated an incremental cost of A$17,804 per additional patient avoiding peritonitis at 2 years for biocompatible compared to standard solution. In a sensitivity analysis excluding extreme outliers for non-peritonitis-related hospitalizations, mean per-patient costs were A$49,159 and A$52,009 for the biocompatible and standard-solution groups, respectively. Consequently, the incremental cost-effectiveness ratio also was reduced significantly: biocompatible solution became both less costly and more effective than standard solution and, in economic terms, was dominant over standard solution. Peritonitis was a secondary outcome of the balANZ trial. Health outcomes measured only in terms of patients avoiding peritonitis over 2 years may underestimate the longer term benefits (eg, prolonged technique survival).
Biocompatible dialysis solutions may offer a cost-effective alternative to standard solutions for peritoneal dialysis patients. Reductions in peritonitis-related hospital costs may offset the higher costs of biocompatible solution.
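The incremental cost-effectiveness ratio quoted above is the difference in mean per-patient cost divided by the difference in the proportion of patients avoiding peritonitis. A minimal sketch of that arithmetic; the effect difference below is back-calculated from the reported ICER and is an assumption, not a trial result:

```python
def icer(cost_new, cost_std, effect_new, effect_std):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return (cost_new - cost_std) / (effect_new - effect_std)

# Mean per-patient costs reported in the base-case analysis (A$).
cost_biocompatible, cost_standard = 57451, 53930
# Difference in the proportion of patients avoiding peritonitis,
# back-calculated from the reported ICER of A$17,804 (an assumption, ~0.198).
delta_effect = (cost_biocompatible - cost_standard) / 17804
print(round(icer(cost_biocompatible, cost_standard, delta_effect, 0.0)))  # 17804
```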
Publisher: BMJ
Date: 2013
Publisher: Oxford University Press (OUP)
Date: 02-04-2020
DOI: 10.1093/NDT/GFAA044
Abstract: Slow recruitment and poor retention jeopardize the reliability and statistical power of clinical trials, delaying access to effective interventions and increasing costs, as commonly observed in nephrology trials. Involving patients in trial design, recruitment and retention is infrequent but potentially transformational. We conducted three workshops involving 105 patients/caregivers and 43 health professionals discussing patient recruitment and retention in clinical trials in chronic kidney disease. We identified four themes. ‘Navigating the unknown’—patients described being unaware of the research question, confused by technical terms, sceptical about findings and feared the risk of harm. ‘Wary of added burden’—patients voiced reluctance to attend additional appointments, were unsure of the commitment required or at times felt too unwell and without capacity to participate. ‘Disillusioned and disconnected’—some patients felt they were taken for granted, particularly if they did not receive trial results. Participants believed there was no culture of trial participation in kidney disease and an overall lack of awareness about opportunities to participate. To improve recruitment and retention, participants addressed ‘Building motivation and interest’. Investigators should establish research consciousness from the time of diagnosis, consider optimal timing for approaching patients, provide comprehensive information in an accessible manner, emphasize current and future relevance to them and their illness, involve trusted clinicians in recruitment and minimize the burden of trial participation. Participation in clinical trials was seen as an opportunity for people to give back to the health system and for future people in their predicament.
Publisher: Wiley
Date: 2005
DOI: 10.1111/J.1442-2042.2004.00993.X
Abstract: The purpose of the present paper was to describe the pattern of expression of insulin-like growth factor (IGF-I) and its regulatory binding proteins (IGFBP) in renal cell carcinoma (RCC). The expressions of mRNA and protein for various IGF members were assessed in 24 paired normal and malignant human renal tissues (16 clear cell and 8 papillary RCC) using semiquantitative reverse transcription-polymerase chain reaction and immunohistochemistry. Paired tissue samples were also obtained from six patients with oncocytoma in order to compare the specificity of changes in IGF/IGFBP expression between tumors derived from proximal (RCC) and distal (oncocytoma) tubular epithelium. Clear cell RCC were characterized by significant increases in the mRNA expression of IGF-I, IGFBP-3 and IGFBP-6 while papillary RCC exhibited down-regulated expression of IGF-I, IGFBP-4 and IGFBP-5. The IGFBP-2, IGFBP-4 and IGFBP-5 mRNA were down-regulated in oncocytomas. Semiquantitative assessment of immunohistochemical staining demonstrated significant increases in epithelial associated IGF-I and IGFBP-3 in clear cell RCC, increased IGFBP-5 protein in papillary RCC and no significant changes in IGF/IGFBP protein expression in oncocytoma. The expression of IGF-I and certain IGFBP is significantly altered in RCC compared with normal renal tissue and oncocytomas. This altered expression is differentially regulated according to the histologic subtype of RCC, and suggests that the IGF/IGFBP axis may play an important role in determining the malignant phenotype of RCC.
Publisher: Springer Science and Business Media LLC
Date: 21-01-2014
Abstract: A new haemodialysis catheter-care procedure has been reported, including exit-site disinfection with chlorhexidine gluconate that results in a sustained reduction in bacteraemia rates, new intravenous antibiotic starts and sepsis-associated and access-associated hospitalization rates compared with standard care. These findings have potential implications for the prevention of haemodialysis catheter-associated infections.
Publisher: SAGE Publications
Date: 07-2004
DOI: 10.1177/089686080402400408
Abstract: The aim of the present investigation was to examine the association between body mass index (BMI) and peritonitis rates among incident peritoneal dialysis (PD) patients in a large cohort with long-term follow-up. Retrospective observational cohort study of the Australian and New Zealand PD patient population. Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry. The study included all incident adult patients (n = 10 709) who received PD in Australia and New Zealand in the 12-year period between 1 April 1991 and 31 March 2003. Patients were classified as obese (BMI ≥ 30 kg/m2), overweight (BMI 25.0 – 29.9 kg/m2), normal weight (BMI 20 – 24.9 kg/m2), or underweight (BMI < 20 kg/m2). Time to first peritonitis and episodes of peritonitis per patient-year were recorded over the 12-year period. Higher BMI was associated with a shorter time to first peritonitis episode, independent of other risk factors [hazard ratio 1.08 for each 5-kg/m2 increase in BMI, 95% confidence interval (CI) 1.04 – 1.12, p < 0.001]. When peritonitis outcomes were analyzed as episodes of peritonitis per patient-year, these rates were significantly higher among patients with higher BMI: underweight 0.69 episodes/year (95% CI 0.66 – 0.73), normal weight 0.79 (95% CI 0.77 – 0.81), overweight 0.88 (95% CI 0.85 – 0.90), obese 1.06 (95% CI 1.02 – 1.09). Coronary artery disease and chronic lung disease were associated with both shorter time to first peritonitis and higher peritonitis rates, independently of these other factors. There was also a “vintage effect,” with lower peritonitis rates seen among people who commenced dialysis in more recent years. Higher BMI at the commencement of renal replacement therapy is a significant risk factor for peritonitis. The mechanisms for this remain undefined.
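The BMI categories used in this analysis map onto simple thresholds, and each carries a reported peritonitis rate. A short sketch pairing the two; the classifier itself is our illustration, not the registry's code:

```python
def bmi_category(bmi):
    """BMI categories (kg/m^2) as defined in the ANZDATA peritonitis analysis."""
    if bmi < 20:
        return "underweight"
    if bmi < 25:
        return "normal weight"
    if bmi < 30:
        return "overweight"
    return "obese"

# Peritonitis episodes per patient-year reported for each category.
rates = {"underweight": 0.69, "normal weight": 0.79,
         "overweight": 0.88, "obese": 1.06}
print(bmi_category(27.5), rates[bmi_category(27.5)])  # overweight 0.88
```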
Publisher: Elsevier BV
Date: 05-2017
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 12-11-2018
Abstract: The comparative effectiveness of treatment with angiotensin-converting enzyme (ACE) inhibitors, angiotensin receptor blockers (ARBs), or their combination in people with albuminuria and cardiovascular risk factors is unclear. In a multicenter, randomized, open label, blinded end point trial, we evaluated the effectiveness on cardiovascular events of ACE or ARB monotherapy or combination therapy, targeting BP <130/80, in patients with moderate or severe albuminuria and diabetes or other cardiovascular risk factors. End points included a primary composite of cardiovascular death, nonfatal myocardial infarction, nonfatal stroke, and hospitalization for cardiovascular causes and a revised end point of all-cause mortality. Additional end points included ESRD, doubling of serum creatinine, albuminuria, eGFR, BP, and adverse events. Because of slow enrollment, the trial was modified and stopped 41% short of targeted enrollment of 2100 participants, corresponding to 35% power to detect a 25% reduced risk in the primary outcome. Our analysis included 1243 adults, with median follow-up of 2.7 years. Efficacy outcomes were similar between groups (ACE inhibitor versus ARB, ACE inhibitor versus combination, ARB versus combination) as were rates of serious adverse events. The rate of permanent discontinuation for ARB monotherapy (6.3%) was significantly lower than for ACE inhibitor monotherapy (15.7%) or combined therapy (18.3%). Patients may tolerate ARB monotherapy better than ACE inhibitor monotherapy. However, data from this trial and similar trials, although as yet inconclusive, show no trend suggesting differences in mortality and renal outcomes with ACE inhibitors or ARBs as dual or monotherapy in patients with albuminuria and diabetes or other cardiovascular risk factors.
Publisher: Elsevier BV
Date: 02-2021
Publisher: AME Publishing Company
Date: 10-2017
Publisher: AMPCo
Date: 08-2005
DOI: 10.5694/J.1326-5377.2005.TB06958.X
Abstract: The systematic staging of chronic kidney disease (CKD) by glomerular filtration measurement and proteinuria has allowed the development of rational and appropriate management plans. One of the barriers to early detection of CKD is the lack of a precise, reliable and consistent measure of kidney function. The most common measure of kidney function is currently serum creatinine concentration. It varies with age, sex, muscle mass and diet, and interlaboratory variation between measurements is as high as 20%. The reference interval for serum creatinine concentration includes up to 25% of people (particularly thin, elderly women) who have an estimated glomerular filtration rate (eGFR) that is significantly reduced (< 60 mL/min/1.73 m2). The working group recommended that laboratories automatically report eGFR with requests for serum creatinine concentration in adults, and that eGFR values above 60 mL/min/1.73 m2 be reported as "> 60 mL/min/1.73 m2", rather than as a precise figure.
Publisher: SAGE Publications
Date: 05-2010
Abstract: Staphylococcus aureus peritonitis is a serious complication of peritoneal dialysis (PD). Since reports of the course and treatment of S. aureus peritonitis have generally been limited to small, single-center studies, the aim of the current investigation was to examine the frequency, predictors, treatment, and clinical outcomes of this condition in all 4675 patients receiving PD in Australia between 1 October 2003 and 31 December 2006. 3594 episodes of peritonitis occurred in 1984 patients and 503 (14%) episodes of S. aureus peritonitis occurred in 355 (8%) individuals. 273 (77%) patients experienced 1 episode of S. aureus peritonitis, 52 (15%) experienced 2 episodes, 19 (5%) experienced 3 episodes, and 11 (3%) experienced 4 or more episodes. The predominant antibiotics used as initial empiric therapy were vancomycin (61%) and cephazolin (31%). Once S. aureus was isolated and identified, the prescription of vancomycin did not appreciably change for methicillin-sensitive S. aureus (MSSA) peritonitis (59%) and increased for methicillin-resistant S. aureus (MRSA) peritonitis (84%). S. aureus peritonitis was associated with a higher rate of relapse than non-S. aureus peritonitis (20% vs 13%, p < 0.001) but comparable rates of hospitalization (67% vs 70%, p = 0.2), catheter removal (23% vs 21%, p = 0.4), hemodialysis transfer (18% vs 18%, p = 0.6), and death (2.2% vs 2.3%, p = 0.9). MRSA peritonitis was independently predictive of an increased risk of permanent hemodialysis transfer [odds ratio (OR) 2.11, 95% confidence interval (CI) 1.17 – 3.82] and tended to be associated with an increased risk of hospitalization (OR 2.00, 95% CI 0.96 – 4.19). The initial empiric antibiotic choice between vancomycin and cephazolin was not significantly associated with clinical outcomes, but serious adverse outcomes were more likely if vancomycin was not used for subsequent treatment of MRSA peritonitis. In conclusion, S.
aureus peritonitis is a serious complication of PD, involves a small proportion of patients, and is associated with a high rate of relapse and repeat episodes. Other adverse clinical outcomes are similar to those for peritonitis overall but are significantly worse for MRSA peritonitis. Empiric initial therapy with either vancomycin or cephazolin results in comparable outcomes, provided vancomycin is prescribed when MRSA is isolated and identified.
Publisher: Elsevier BV
Date: 03-2020
Publisher: Bentham Science Publishers Ltd.
Date: 11-2008
Publisher: SAGE Publications
Date: 2013
Abstract: The impact of climatic variations on peritoneal dialysis (PD)–related peritonitis has not been studied in detail. The aim of the current study was to determine whether various climatic zones influenced the probability of occurrence or the clinical outcomes of peritonitis. Using ANZDATA registry data, the study included all Australian patients receiving PD between 1 October 2003 and 31 December 2008. Climatic regions were defined according to the Köppen classification. The overall peritonitis rate was 0.59 episodes per patient–year. Most of the patients lived in Temperate regions (65%), with others residing in Subtropical (26%), Tropical (6%), and Other climatic regions (Desert, 0.6%; Grassland, 2.3%). Compared with patients in Temperate regions, those in Tropical regions demonstrated significantly higher overall peritonitis rates and a shorter time to a first peritonitis episode [adjusted hazard ratio: 1.15; 95% confidence interval (CI): 1.01 to 1.31]. Culture-negative peritonitis was significantly less likely in Tropical regions [adjusted odds ratio (OR): 0.42; 95% CI: 0.25 to 0.73]; its occurrence in Subtropical and Other regions was comparable to that in Temperate regions. Fungal peritonitis was independently associated with Tropical regions (OR: 2.18; 95% CI: 1.22 to 3.90) and Other regions (OR: 3.46; 95% CI: 1.73 to 6.91), where rates of antifungal prophylaxis were also lower. Outcomes after first peritonitis episodes were comparable in all groups. Tropical regions were associated with a higher overall peritonitis rate (including fungal peritonitis) and a shorter time to a first peritonitis episode. Augmented peritonitis prophylactic measures such as antifungal therapy and exit-site care should be considered in PD patients residing in Tropical climates.
Publisher: Oxford University Press (OUP)
Date: 07-10-2015
DOI: 10.1093/NDT/GFU313
Abstract: The use of peritoneal dialysis (PD) varies widely from country to country, with the main limitation being infectious complications, particularly peritonitis, which leads to technique failure, hospitalization and increased mortality. A large number of prophylactic strategies have been employed to reduce the occurrence of peritonitis, including the use of oral, nasal and topical antibiotics, disinfection of the exit site, modification of the transfer set used in continuous ambulatory PD exchanges, changes to the design of the PD catheter implanted, the surgical method by which the PD catheter is inserted, the type and length of training given to patients, the occurrence of home visits by trained PD nurses, the use of antibiotic prophylaxis in patients undergoing certain invasive procedures and the administration of antifungal prophylaxis to PD patients whenever they are given an antibiotic treatment course. This review summarizes the existing evidence evaluating these interventions to prevent exit-site/tunnel infections and peritonitis.
Publisher: Oxford University Press (OUP)
Date: 31-08-2012
DOI: 10.1093/NDT/GFS135
Abstract: People with chronic kidney disease (CKD) have a high symptom burden and experience poorer quality of life than the general population. People with CKD frequently report fatigue, anorexia, pain, sleep disturbance, itching and restless legs. Depression and sexual dysfunction may also be common in CKD, although questions about optimal diagnosis and treatment remain unanswered. People with kidney disease identify lifestyle and the impact of CKD on family and psychosocial supports as key priorities and rate symptoms such as sexual dysfunction and psychological distress as severe. Here, we outline the current state of research underlying depression and sexual dysfunction in this population focusing on prevalence, diagnosis, screening, outcomes and interventions and suggest areas requiring additional specific research.
Publisher: Springer Science and Business Media LLC
Date: 04-07-2014
Publisher: Wiley
Date: 26-10-2018
Publisher: Wiley
Date: 27-03-2014
Publisher: Wiley
Date: 19-04-2013
DOI: 10.1111/NEP.12052
Publisher: Oxford University Press (OUP)
Date: 25-11-2013
DOI: 10.1093/NDT/GFS492
Abstract: There are few reports regarding the long-term renal replacement therapy (RRT) outcomes of amyloidosis. In this retrospective, multi-centre, multi-country registry analysis, all patients with and without amyloidosis who commenced RRT for end-stage renal failure (ESRF) in Australia and New Zealand between 1963 and 2010 were included. Of 58 422 patients who underwent RRT during the study period, 490 (0.8%) had ESRF secondary to amyloidosis. The median survival of amyloidosis patients on dialysis (2.09 years, 95% CI 1.85-2.32 years) was significantly inferior to that of patients with other causes of ESRF (4.45 years, 95% CI 4.39-4.51 years) (log-rank score 242, P < 0.001). The survival of amyloidosis patients receiving peritoneal dialysis (1.9 years, 95% CI 1.58-2.22) was comparable with those receiving haemodialysis (2.17 years, 95% CI 1.89-2.45) (P = 0.18). Fifty-three (13.8%) amyloidosis patients died of amyloidosis complications. Forty-six patients underwent renal transplantation with first graft survival rates of 45% at 5 years and 26% at 10 years. Nine (16.4%) patients experienced amyloidosis recurrence in their allografts, which led to graft failure in six patients. ESRF patients with amyloidosis experienced inferior median first renal allograft survival (4.55 years, 95% CI 1.96-7.15 versus 10.7 years, 95% CI 10.5-11.0, P = 0.001) and transplant patient survival (6.03 years, 95% CI 2.71-9.36 versus 16.8 years, 95% CI 16.4-17.1, P < 0.001) compared with patients with other causes of ESRF. Respective 10-year patient survival rates were 37 and 69%. Amyloidosis was associated with poor patient survival following dialysis and/or renal transplantation, poor renal allograft survival and a significant incidence of disease recurrence in the allograft. An appreciable proportion of amyloid ESRF patients died of amyloidosis-related complications.
Publisher: Wiley
Date: 09-2015
DOI: 10.1111/AJAG.12231
Abstract: To determine whether the frailty status of patients with chronic kidney disease (CKD) can be measured using a Frailty index (FI). One hundred and eleven patients attending a nephrology clinic were approached to complete a one-page questionnaire evaluating cognitive, psychological and functional status. Data were coded as deficits, summed and divided by the total number of deficits considered, to derive an FI-CKD. One hundred and ten (mean age 65.2 years) agreed to participate and assessments took approximately 10 minutes to complete. Mean FI-CKD was 0.25 (SD 0.12). The FI-CKD increased with age at 3% per year, correlated with a modified Fried phenotype (P < 0.001) and increased significantly across CKD stages (P = 0.04). The FI-CKD is feasible in the outpatient setting and has good construct validity. The greater granularity of a continuous measure has the potential to inform decision-making regarding appropriate interventions for patients at the 'frail' end of the health spectrum.
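A frailty index of this kind is simply the number of deficits present divided by the number of deficits considered. A minimal sketch; the 20-item questionnaire below is hypothetical, chosen so the result matches the cohort's mean FI-CKD of 0.25:

```python
def frailty_index(deficits):
    """Frailty index: deficits present divided by deficits considered.

    `deficits` is a list of 0/1 codes, one per questionnaire item
    (the items themselves are hypothetical here)."""
    return sum(deficits) / len(deficits)

# A hypothetical 20-item questionnaire in which 5 deficits are present.
answers = [1, 0, 0, 1, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0]
print(frailty_index(answers))  # 0.25, matching the cohort's mean FI-CKD
```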
Publisher: Oxford University Press (OUP)
Date: 07-09-2012
DOI: 10.1093/NDT/GFS372
Abstract: Preliminary clinical evidence suggests that heme iron polypeptide (HIP) might represent a promising, novel oral iron supplementation strategy in chronic kidney disease. The aim of this multi-centre randomized controlled trial was to determine the ability of HIP administration to augment iron stores in darbepoetin (DPO)-treated patients compared with conventional oral iron supplementation. Adult peritoneal dialysis (PD) patients treated with DPO were randomized 1:1 to receive two capsules daily of either HIP or ferrous sulphate per os for 6 months. The primary outcome measure was transferrin saturation (TSAT). Secondary outcomes comprised serum ferritin, haemoglobin, DPO dose and responsiveness, and adverse events. Sixty-two patients were randomized to HIP (n = 32) or ferrous sulphate (n = 30). On intention-to-treat analysis, the median (inter-quartile range) TSAT was 22% (16-29) in the HIP group compared with 20% (17-26) in controls (P = 0.65). HIP treatment was not significantly associated with TSAT at 6 months on multivariable analysis (P = 0.95). Similar results were found on per-protocol analysis and subgroup analysis in iron-deficient patients. Serum ferritin levels at 6 months were significantly lower in the HIP group (P = 0.003), while the cost of HIP was 7-fold higher than that of ferrous sulphate. No other differences in secondary outcomes were observed. HIP showed no clear safety or efficacy benefit in PD patients compared with conventional oral iron supplements. The reduction in serum ferritin levels and high costs associated with HIP therapy suggest that this agent is unlikely to have a significant role in iron supplementation in PD patients.
Publisher: Elsevier BV
Date: 09-2012
DOI: 10.1016/J.AJPATH.2012.06.009
Abstract: Malignant prostate cancer (PCa) is usually treated with androgen deprivation therapies (ADTs). Recurrent PCa is resistant to ADT. This research investigated whether PCa can potentially produce androgens de novo, making them androgen self-sufficient. Steroidogenic enzymes required for androgen synthesis from cholesterol (CYP11A1, CYP17A1, HSD3β, HSD17β3) were investigated in human primary PCa (n = 90), lymph node metastases (LNMs; n = 8), and benign prostatic hyperplasia (BPH; n = 6) with the use of IHC. Six prostate cell lines were investigated for mRNA and protein for steroidogenic enzymes and for endogenous synthesis of testosterone and 5α-dihydrotestosterone. All enzymes were identified in PCa, LNMs, BPH, and cell lines. CYP11A1 (rate-limiting enzyme) was expressed in cancerous and noncancerous prostate glands. CYP11A1, CYP17A1, HSD3β, and HSD17β3 were identified, respectively, in 78%, 52%, 16%, and 82% of human BPH and PCa samples. Approximately 10% of primary PCa, LNMs, and BPH expressed all four enzymes simultaneously. CYP11A1 expression was stable, CYP17A1 increased, and HSD3β and HSD17β3 decreased with disease progression. CYP17A1 expression was significantly correlated with CYP11A1 (P = 0.0009), HSD3β (P = 0.0297), and HSD17β3 (P = 0.0090) in vivo, suggesting CYP17A1 has a key role in prostatic steroidogenesis similar to testis and adrenal roles. In vitro, all cell lines expressed mRNA for all enzymes. Protein was not always detectable; however, all cell lines synthesized androgen from cholesterol. The results indicate that monitoring steroidogenic metabolites in patients with PCa may provide useful information for therapy intervention.
Publisher: SAGE Publications
Date: 07-2017
Abstract: Few studies have examined the relationship between socio-economic position (SEP) and peritoneal dialysis (PD) outcomes, particularly at a country level. The aim of this study was to investigate the relationships between SEP, technique failure, and mortality in PD patients undertaking treatment in Australia. The study included all Australian non-indigenous incident PD patients between January 1, 1997, and December 31, 2014, using Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry data. The SEP was assessed by quartiles of postcode-based Australian Socio-Economic Indexes for Areas (SEIFA), including Index of Relative Socio-economic Advantage and Disadvantage (IRSAD – primary index), Index of Relative Socio-economic Disadvantage (IRSD), Index of Economic Resources (IER), and Index of Education and Occupation (IEO). Technique and patient survival were evaluated by multivariable Cox proportional hazards survival analyses. The study included 9,766 patients (mean age 60.6 ± 15 years, 57% male, 38% diabetic). Using multivariable Cox regression, no significant association was observed between quartiles of IRSAD and technique failure (30-day definition, p = 0.65; 180-day definition, p = 0.68). Similar results were obtained using competing risks regression. However, higher SEP, defined by quartiles of IRSAD, was associated with better patient survival (Quartile 1, reference; Quartile 2, adjusted hazard ratio [HR] 0.96, 95% confidence interval [CI] 0.86 – 1.06; Quartile 3, HR 0.87, 95% CI 0.77 – 0.99; Quartile 4, HR 0.86, 95% CI 0.76 – 0.97). Similar results were found when IRSD was analyzed, but results were no longer statistically significant for IER and IEO. In Australia, where there is universal free healthcare, SEP was not associated with PD technique failure in non-indigenous PD patients. Higher SEP was generally associated with improved patient survival.
Publisher: SAGE Publications
Date: 03-2015
Abstract: The ability of urinary biomarkers to predict residual renal function (RRF) decline in peritoneal dialysis (PD) patients has not been defined. The present study aimed to explore the utility of established biomarkers from kidney injury models for predicting loss of RRF in incident PD patients, and to evaluate the impact on RRF of using neutral-pH PD solution low in glucose degradation products. The study included 50 randomly selected participants from the balANZ trial who had completed 24 months of follow-up. A change in glomerular filtration rate (GFR) was used as the primary clinical outcome measure. In a mixed-effects general linear model, baseline measurements of 18 novel urinary biomarkers and albumin were used to predict GFR change. The model was further used to evaluate the impact of biocompatible PD solution on RRF, adjusted for each biomarker. Baseline albuminuria was not a useful predictor of change in RRF in PD patients (p = 0.84). Only clusterin was a significant predictor of GFR decline in the whole population (p = 0.04, adjusted for baseline GFR and albuminuria). However, the relationship was no longer apparent when albuminuria was removed from the model (p = 0.31). When the effect of the administered PD solutions was examined using a model adjusted for PD solution type, baseline albuminuria, and GFR, higher baseline urinary concentrations of trefoil factor 3 (TFF3, p = 0.02), kidney injury molecule 1 (KIM-1, p = 0.04), and interferon γ–induced protein 10 (IP-10, p = 0.03) were associated with more rapid decline of RRF in patients receiving conventional PD solution compared with biocompatible PD solution. Higher urinary levels of kidney injury biomarkers (TFF3, KIM-1, IP-10) at baseline predicted significantly slower RRF decline in patients receiving biocompatible PD solutions.
Findings from the present investigation should help to guide future studies to validate the utility of urinary biomarkers as tools to predict RRF decline in PD patients.
Publisher: Springer Science and Business Media LLC
Date: 16-09-2010
Publisher: SAGE Publications
Date: 09-2016
Abstract: Previous studies have reported significant variation in peritonitis rates across dialysis centers. Limited evidence is available to explain this variability. The aim of this study was to assess center-level predictors of peritonitis and their relationship with peritonitis rate variations. All incident peritoneal dialysis (PD) patients treated in Australia between October 2003 and December 2013 were included. Data were accessed through the Australia and New Zealand Dialysis and Transplant Registry. The primary outcome was peritonitis rate, evaluated in a mixed effects negative binomial regression model. Peritonitis-free survival was assessed as a secondary outcome in a Cox proportional hazards model. Overall, 8,711 incident PD patients from 51 dialysis centers were included in the study. Center-level predictors of lower peritonitis rates included smaller center size, high proportion of PD, low peritoneal equilibration test use at PD start, and low proportion of hospitalization for peritonitis. In contrast, a low proportion of automated PD exposure, high icodextrin exposure, and low or high use of antifungal prophylaxis at the time of peritonitis were associated with a higher peritonitis rate. Similar results were obtained for peritonitis-free survival. Overall, accounting for center-level characteristics appreciably decreased peritonitis variability among dialysis centers (p = 0.02). This study identified specific center-level characteristics associated with the variation in peritonitis risk. Whether these factors are directly related to peritonitis risk or surrogate markers for other center characteristics is uncertain and should be validated in further studies.
Publisher: Wiley
Date: 22-04-2008
Publisher: SAGE Publications
Date: 07-2015
Abstract: Cardiovascular mortality has remained high in patients on peritoneal dialysis (PD) due to the high prevalence of various cardiovascular complications including coronary artery disease, left ventricular hypertrophy and dysfunction, heart failure, arrhythmia (especially atrial fibrillation), cerebrovascular disease, and peripheral arterial disease. In addition, nearly a quarter of PD patients develop sudden cardiac death as the terminal life event. Thus, it is essential to identify effective treatment that may lower cardiovascular mortality and improve survival of PD patients. The International Society for Peritoneal Dialysis (ISPD) commissioned a global workgroup in 2012 to formulate a series of recommendation statements regarding lifestyle modification, assessment and management of various cardiovascular risk factors, and management of the various cardiovascular complications to be published in 2 guideline documents. This publication forms the second part of the guideline documents and includes recommendation statements on the management of various cardiovascular complications in adult chronic PD patients. The documents are intended to serve as a global clinical practice guideline for clinicians who look after PD patients. We also define areas where evidence is clearly deficient and make suggestions for future research in each specific area.
Publisher: Wiley
Date: 12-09-2017
DOI: 10.1111/JORC.12213
Abstract: International guidelines recommend treatment of anaemia due to chronic kidney disease (CKD) with erythropoiesis-stimulating agents (ESAs). The aim of this study was to document the time required, and the associated cost in terms of nursing time, to prepare and administer ESAs to patients on facility-based haemodialysis (HD) with anaemia due to CKD, before and after the introduction of long-acting ESAs. A time and motion study was implemented at four HD units in Australia to determine the time and costs associated with preparing and administering ESAs before and after the introduction of long-acting ESAs. This was a prospective, observational study of workplace practices at four HD units in Australia. Outcome data included the time taken to prepare and administer ESAs. The time costs of preparation and administration per patient per year showed wide variability within each unit, ranging from AUD$55.75 (38 euros) to AUD$90.49 (62 euros) before the introduction of long-acting ESAs. This dropped by 73-80% following the introduction of long-acting ESAs, representing an annual cost saving of between AUD$2,591 and AUD$5,914 if all patients on HD were switched to a long-acting ESA. Switching from a short-acting to a long-acting ESA in HD units leads to a significant reduction in the time costs of health professionals in preparing and administering ESAs, by up to 80%. Practical application: this time and motion study adds further evidence on reducing human effort by taking advantage of new research developments, such as long-acting ESAs.
Publisher: SAGE Publications
Date: 07-2015
Abstract: Cardiovascular disease contributes significantly to the adverse clinical outcomes of peritoneal dialysis (PD) patients. Numerous cardiovascular risk factors play important roles in the development of various cardiovascular complications. Of these, loss of residual renal function is regarded as one of the key cardiovascular risk factors and is associated with an increased mortality and cardiovascular death. It is also recognized that PD solutions may incur significant adverse metabolic effects in PD patients. The International Society for Peritoneal Dialysis (ISPD) commissioned a global workgroup in 2012 to formulate a series of recommendations regarding lifestyle modification, assessment and management of various cardiovascular risk factors, as well as management of the various cardiovascular complications including coronary artery disease, heart failure, arrhythmia (specifically atrial fibrillation), cerebrovascular disease, peripheral arterial disease and sudden cardiac death, to be published in 2 guideline documents. This publication forms the first part of the guideline documents and includes recommendations on assessment and management of various cardiovascular risk factors. The documents are intended to serve as a global clinical practice guideline for clinicians who look after PD patients. The ISPD workgroup also identifies areas where evidence is lacking and further research is needed.
Publisher: Springer Science and Business Media LLC
Date: 18-03-2021
DOI: 10.1186/S12882-021-02279-0
Abstract: Patients on chronic dialysis are at increased risk of postoperative mortality following elective surgery compared to patients with normal kidney function, but morbidity outcomes are less often reported. This study ascertains the excess odds of postoperative cardiovascular and infection-related morbidity outcomes for patients on chronic dialysis. Systematic searches were performed using MEDLINE, Embase and the Cochrane Library to identify relevant studies published from inception to January 2020. Eligible studies reported postoperative morbidity outcomes in chronic dialysis and non-dialysis patients undergoing major non-transplant surgery. Risk of bias was assessed using the Newcastle-Ottawa Scale and the certainty of evidence was summarised using GRADE. Random effects meta-analyses were performed to derive summary odds estimates. Meta-regression and sensitivity analyses were performed to explore heterogeneity. Forty-nine studies involving 10,513,934 patients with normal kidney function and 43,092 patients receiving chronic dialysis were included. Patients on chronic dialysis had increased unadjusted odds of postoperative cardiovascular and infectious complications within each surgical discipline. However, the excess odds of cardiovascular complications were attenuated when odds ratios were adjusted for age and comorbidities: myocardial infarction (general surgery, OR 1.83, 95% CI 1.29–2.36) and stroke (general surgery, OR 0.95, 95% CI 0.84–1.06). The excess odds of infectious complications remained substantially higher for patients on chronic dialysis, particularly for sepsis (general surgery, OR 2.42, 95% CI 2.12–2.72). Patients on chronic dialysis are at increased odds of both cardiovascular and infectious complications following elective surgery, with the excess odds of cardiovascular complications attributable to being on dialysis being highest among younger patients without comorbidities.
However, further research is needed to better inform perioperative risk assessment.
Publisher: SAGE Publications
Date: 03-2017
Publisher: SAGE Publications
Date: 03-2014
Abstract: There is limited available evidence regarding the role of monitoring serum gentamicin concentrations in peritoneal dialysis (PD) patients receiving this antimicrobial agent in gram-negative PD-associated peritonitis. Using data collected in all patients receiving PD at a single center who experienced a gram-negative peritonitis episode between 1 January 2005 and 31 December 2011, we investigated the relationship between measured serum gentamicin levels on day 2 following initial empiric antibiotic therapy and subsequent clinical outcomes of confirmed gram-negative peritonitis. Serum gentamicin levels were measured on day 2 in 51 (77%) of 66 first gram-negative peritonitis episodes. Average serum gentamicin levels on day 2 were 1.83 ± 0.84 mg/L, with levels exceeding 2 mg/L in 22 (43%) cases. The overall cure rate was 64%. No cases of ototoxicity were observed. Day-2 gentamicin levels were not significantly different between patients who did and did not have a complication or cure. Using multivariable logistic regression analysis, failure to cure peritonitis was not associated with either day-2 gentamicin level (adjusted odds ratio (OR) 0.96, 95% confidence interval (CI) 0.25 – 3.73) or continuation of gentamicin therapy beyond day 2 (OR 0.28, 95% CI 0.02 – 3.56). The only exception was polymicrobial peritonitis, where day-2 gentamicin levels were significantly higher in episodes that were cured (2.06 ± 0.41 vs 1.29 ± 0.71 mg/L, p = 0.01). In 17 (26%) patients receiving extended gentamicin therapy, day-5 gentamicin levels were not significantly related to peritonitis cure. Day-2 gentamicin levels did not predict gentamicin-related harm or efficacy during short-course gentamicin therapy for gram-negative PD-related peritonitis, except in cases of polymicrobial peritonitis, where higher levels were associated with cure.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 02-2016
DOI: 10.2215/CJN.05240515
Publisher: Oxford University Press (OUP)
Date: 06-2012
DOI: 10.1093/NDT/GFR274
Abstract: Aminoglycosides offer several potential benefits in the treatment of peritoneal dialysis (PD)-associated peritonitis, including low cost, activity against Gram-negative organisms (including Pseudomonas aeruginosa), synergistic bactericidal activity against some Gram-positive organisms (such as staphylococci) and a relatively low propensity to promote antimicrobial resistance. However, there is limited, conflicting evidence that aminoglycosides may accelerate loss of residual renal function (RRF) in PD patients. The aim of this study was to examine the effect of aminoglycoside use on the slope of decline in RRF. The study included 2715 Australian patients receiving PD between October 2003 and December 2007 in whom at least two measurements of renal creatinine clearance were available. Patients were divided according to tertiles of slope of RRF decline (rapid, intermediate and slow). The primary outcome was the slope of RRF over time in patients who received aminoglycosides for PD peritonitis versus those who did not. A total of 1412 patients (52%) experienced at least one episode of PD peritonitis. An aminoglycoside was used as the initial empiric antibiotic in 1075 patients. The slopes of RRF decline were similar in patients treated and not treated with at least one course of aminoglycoside (median [interquartile range] -0.26 [-1.17 to 0.04] mL/min/1.73 m(2)/month versus -0.22 [-1.11 to 0.01] mL/min/1.73 m(2)/month, P = 0.9). The slopes of RRF decline were also similar in patients receiving repeated courses of aminoglycoside. Empiric treatment with aminoglycoside for peritonitis was not associated with an adverse effect on RRF in PD patients.
Publisher: Oxford University Press (OUP)
Date: 19-07-2012
DOI: 10.1093/NDT/GFR397
Abstract: High cardiovascular risk in chronic kidney disease (CKD) patients appears only partly attributable to atherosclerosis, with much of the remaining risk being ascribed to other vasculature abnormalities, including endothelial dysfunction, arterial stiffness and vascular calcification (VC). To date, these factors have been primarily studied in isolation or in dialysis patients. This study performed a global vascular assessment in moderate CKD and assessed the relationships with both traditional and novel risk factors. This was a prospective cross-sectional analysis of 120 patients (age 60 ± 10 years; estimated glomerular filtration rate 25-60 mL/min/1.73 m(2)). Demographic, clinical and biochemical characterization was performed. VC was characterized by lateral lumbar radiograph; arterial stiffness by aortic pulse-wave velocity (PWV); atheroma burden by carotid intima-media thickness (cIMT); and endothelial function by flow-mediated dilation (FMD) of the brachial artery. VC was highly prevalent (74%), and FMD generally poor (FMDΔ 3.3 ± 3.3%). There were significant correlations between all vascular parameters, although these were predominantly explained by age. cIMT was independently associated with classical risks and also with PWV (adjusted standardized β = 0.31, P = 0.001). However, traditional risks showed almost no independent associations with other vascular measurements. In contrast, serum phosphate and 1,25-dihydroxyvitamin D (1,25-OHD) correlated with PWV and the presence of VC, respectively. After adjustment, every 1 pg/mL increase in 1,25-OHD was related to a 3% reduction in the chance of VC (odds ratio 0.97; 95% confidence interval 0.94-1.00; P = 0.03). Medication use, HOMA-IR and C-reactive protein did not correlate with any of the vascular measures. This study demonstrates extensive vascular disease across multimodality imaging in moderate CKD. 
Atherosclerotic burden correlated with traditional risks and PWV, while higher 1,25-OHD was associated with less VC. The lack of association between renal function and imaging indices raises the possibility of a threshold, rather than graded uraemic effect on vascular health that warrants further exploration.
Publisher: Elsevier BV
Date: 09-2011
DOI: 10.1053/J.AJKD.2011.03.022
Abstract: The causes, predictors, treatment, and outcomes of relapsed and recurrent peritoneal dialysis (PD)-associated peritonitis are poorly understood. This was an observational cohort study using Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry data, including all Australian PD patients between October 1, 2003, and December 31, 2007, with first episodes of peritonitis. Predictors were demographic, clinical, and facility variables and type of peritonitis: relapse (same organism or culture-negative episode occurring within 4 weeks of completion of therapy of a prior episode, or 5 weeks if vancomycin was used); recurrence (different organism occurring within 4 weeks of completion of therapy of a prior episode, or 5 weeks if vancomycin was used); and control (first peritonitis episode without relapse or recurrence). Outcomes were hospitalization, catheter removal, hemodialysis therapy transfer, and death. Of 6,024 PD patients studied, first episodes of relapsed, recurrent, and control peritonitis occurred in 356, 165, and 2,021 patients, respectively. Coagulase-negative staphylococci and Staphylococcus aureus accounted for 48% of relapsing peritonitis (adjusted OR, 1.26 [95% CI, 0.94-1.70] and 1.54 [95% CI, 1.08-2.19], respectively), but were much less likely to be isolated in recurrent peritonitis. Recurrent peritonitis was associated more frequently with fungi (13%; OR, 2.16; 95% CI, 1.12-4.17). The empirical antimicrobial approaches to relapsing and recurrent peritonitis were similar and their subsequent clinical outcomes were comparable. Compared with uncomplicated peritonitis, relapsed and recurrent peritonitis were associated with higher rates of catheter removal (22% vs 30% vs 37%, respectively; P < 0.001) and permanent hemodialysis therapy transfer (20% vs 25% vs 32%; P < 0.001), but similar rates of hospitalization (73% vs 70% vs 70%) and death (2.8% vs 2.0% vs 1.2%). Limitations included limited covariate adjustment; residual confounding and coding bias could not be excluded. 
Relapsed and recurrent peritonitis are caused by different spectra of micro-organisms, but are not readily clinically distinguishable at presentation. Empirical treatment with broad-spectrum antibiotics and subsequent adjustment according to antimicrobial susceptibilities results in similar clinical outcomes, albeit with appreciably higher rates of catheter removal and hemodialysis therapy transfer than for uncomplicated peritonitis.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 2022
Abstract: This systematic review summarized evidence from randomized controlled trials concerning the benefits and risks of noncalcium-based phosphate-lowering treatment in nondialysis CKD compared with placebo, calcium-based phosphate binders, and no study medication. Noncalcium-based phosphate-lowering therapy reduced serum phosphate and urinary phosphate excretion, but with unclear effects on clinical outcomes and intermediate cardiovascular end points. There was an associated increased risk of constipation and vascular calcification with noncalcium-based phosphate binders compared with placebo. This study highlights the need for more adequately powered trials to evaluate the benefits and risks of phosphate-lowering therapy on patient-centered outcomes in people with CKD. The benefits of phosphate-lowering interventions on clinical outcomes in patients with CKD are unclear; systematic reviews have predominantly involved patients on dialysis. This study aimed to summarize evidence from randomized controlled trials (RCTs) concerning the benefits and risks of noncalcium-based phosphate-lowering treatment in nondialysis CKD. We conducted a systematic review and meta-analyses of RCTs involving noncalcium-based phosphate-lowering therapy compared with placebo, calcium-based binders, or no study medication, in adults with CKD not on dialysis or post-transplant. RCTs had ≥3 months of follow-up, and outcomes included biomarkers of mineral metabolism, cardiovascular parameters, and adverse events. Outcomes were meta-analyzed using the Sidik–Jonkman method for random effects. Unstandardized mean differences were used as effect sizes for continuous outcomes with common measurement units, and Hedges' g standardized mean differences (SMD) otherwise. Odds ratios were used for binary outcomes. Cochrane risk of bias and GRADE assessment determined the certainty of evidence. In total, 20 trials involving 2498 participants (median sample size 120, median follow-up 9 months) were eligible for inclusion. 
Overall, risk of bias was low. Compared with placebo, noncalcium-based phosphate binders reduced serum phosphate (12 trials; weighted mean difference, -0.37; 95% CI, -0.58 to -0.15 mg/dl; low certainty evidence) and urinary phosphate excretion (eight trials; SMD, -0.61; 95% CI, -0.90 to -0.31; low certainty evidence), but resulted in increased constipation (nine trials; log odds ratio [OR], 0.93; 95% CI, 0.02 to 1.83; low certainty evidence) and greater vascular calcification score (three trials; SMD, 0.47; 95% CI, 0.17 to 0.77; very low certainty evidence). Data for the effects of phosphate-lowering therapy on cardiovascular events (log OR, 0.51; 95% CI, -0.51 to 1.17) and death were scant. Noncalcium-based phosphate-lowering therapy reduced serum phosphate and urinary phosphate excretion, but there was an unclear effect on clinical outcomes and intermediate cardiovascular end points. Adequately powered RCTs are required to evaluate the benefits and risks of phosphate-lowering therapy on patient-centered outcomes.
Publisher: MDPI AG
Date: 03-04-2013
Publisher: Elsevier BV
Date: 10-2017
Publisher: Springer Science and Business Media LLC
Date: 12-2009
Publisher: Frontiers Media SA
Date: 20-07-2020
DOI: 10.1111/TRI.13681
Publisher: Elsevier BV
Date: 11-2011
DOI: 10.1016/J.BIOCEL.2011.08.003
Abstract: Renal cell carcinoma (RCC), the commonest type of kidney cancer, is highly metastatic and the deadliest of all urologic cancers. Despite the development of many novel chemotherapeutics in recent years, metastatic RCC remains an incurable and lethal disease. The identification of novel molecular targets and more effective therapeutics for metastatic RCC therefore remains imperative. One promising target is the transcription factor nuclear factor kappa B (NF-κB). NF-κB is unique in the sense that it regulates all the important aspects of RCC biology that pose a challenge to conventional therapy - resistance to apoptosis, angiogenesis and multi-drug resistance. Aberrations in the von Hippel-Lindau gene (VHL) are the most important risk factor for the development of RCC, especially the clear cell type, which constitutes 70-80% of RCC. VHL is a negative regulator of NF-κB. In the absence of a functional VHL, the expression and activity of NF-κB are enhanced, which subsequently confer drug resistance and promote epithelial-mesenchymal transition of RCC. This review provides an overview of RCC, its molecular mechanisms, the role of NF-κB in carcinomas including RCC, and the rationale for NF-κB as a target molecule.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 06-2014
DOI: 10.2215/CJN.09730913
Publisher: Wiley
Date: 2003
DOI: 10.1002/PATH.1368
Abstract: Caveolae and their proteins, the caveolins, transport macromolecules, compartmentalize signalling molecules, and are involved in various repair processes. There is little information regarding their role in the pathogenesis of significant renal syndromes such as acute renal failure (ARF). In this study, an in vivo rat model of 30 min bilateral renal ischaemia followed by reperfusion times from 4 h to 1 week was used to map the temporal and spatial association between caveolin-1 and tubular epithelial damage (desquamation, apoptosis, necrosis). An in vitro model of ischaemic ARF was also studied, where cultured renal tubular epithelial cells or arterial endothelial cells were subjected to injury initiators modelled on ischaemia-reperfusion (hypoxia, serum deprivation, free radical damage or hypoxia-hyperoxia). Expression of caveolin proteins was investigated using immunohistochemistry, immunoelectron microscopy, and immunoblots of whole cell, membrane or cytosol protein extracts. In vivo, healthy kidney had abundant caveolin-1 in vascular endothelial cells and also some expression in membrane surfaces of distal tubular epithelium. In the kidneys of ARF animals, punctate cytoplasmic localization of caveolin-1 was identified, with high intensity expression in injured proximal tubules that were losing basement membrane adhesion or were apoptotic, 24 h to 4 days after ischaemia-reperfusion. Western immunoblots indicated a marked increase in caveolin-1 expression in the cortex where some proximal tubular injury was located. In vitro, the main treatment-induced change in both cell types was translocation of caveolin-1 from the original plasma membrane site into membrane-associated sites in the cytoplasm. Overall, expression levels did not alter for whole cell extracts and the protein remained membrane-bound, as indicated by cell fractionation analyses. Caveolin-1 was also found to localize intensely within apoptotic cells. 
The results are indicative of a role for caveolin-1 in ARF-induced renal injury. Whether it functions for cell repair or death remains to be elucidated.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 21-07-2016
DOI: 10.2215/CJN.00190116
Abstract: Emerging evidence from recently published observational studies and an individual patient data meta-analysis shows that mammalian target of rapamycin inhibitor use in kidney transplantation is associated with increased mortality. Therefore, all-cause mortality and allograft loss were compared between use and nonuse of mammalian target of rapamycin inhibitors in patients from Australia and New Zealand, where mammalian target of rapamycin inhibitor use has been greater because of heightened skin cancer risk. Our longitudinal cohort study included 9353 adult patients who underwent 9558 kidney transplants between January 1, 1996 and December 31, 2012 and had allograft survival ≥1 year. Risk factors for all-cause death and all-cause and death-censored allograft loss were analyzed by multivariable Cox regression using mammalian target of rapamycin inhibitor as a time-varying covariate. Additional analyses evaluated mammalian target of rapamycin inhibitor use at fixed time points of baseline and 1 year. Patients using mammalian target of rapamycin inhibitors were more likely to be white and have a history of pretransplant cancer. Over a median follow-up of 7 years, 1416 (15%) patients died, and 2268 (24%) allografts were lost. There was a higher risk of all-cause mortality with time-varying mammalian target of rapamycin inhibitor use (hazard ratio, 1.47; 95% confidence interval, 1.23 to 1.76) as well as in the fixed time model analyses comparing mammalian target of rapamycin inhibitor use at baseline (hazard ratio, 1.54; 95% confidence interval, 1.22 to 1.93) and 1 year (hazard ratio, 1.63; 95% confidence interval, 1.32 to 2.01). Time-varying mammalian target of rapamycin inhibitor use was associated with higher risk of death because of malignancy (hazard ratio, 1.37; 95% confidence interval, 1.09 to 1.71). 
There were no statistically significant differences in the risk of all-cause (hazard ratio, 0.98; 95% confidence interval, 0.85 to 1.12) and death-censored (hazard ratio, 0.85; 95% confidence interval, 0.69 to 1.03) allograft loss between the mammalian target of rapamycin inhibitor use and nonuse groups in the time-varying model as well as the fixed time models. Mammalian target of rapamycin inhibitor use was associated with a higher risk of all-cause mortality but not allograft loss.
Publisher: Elsevier BV
Date: 2015
Publisher: Wiley
Date: 10-2005
Publisher: SAGE Publications
Date: 09-2018
Abstract: Glucose is the most commonly used osmotic medium in peritoneal dialysis (PD) solutions, and its use has been associated with both local and systemic adverse effects. Previous, single-center, observational cohort studies have reported conflicting findings regarding whether a relationship exists between peritoneal glucose exposure and peritoneal small solute transport rate. In this secondary analysis of the balANZ multi-center, multinational, randomized controlled trial of a neutral pH, ultra-low glucose degradation product (biocompatible) versus conventional PD solutions over a 2-year period, the relationship between time-varying peritoneal glucose exposure and change in peritoneal solute transport rate (measured as dialysate to plasma creatinine ratio at 4 hours [D:PCr 4h]) was evaluated using multivariable, multilevel linear regression. Baseline peritoneal glucose exposure was also assessed as either a continuous or categorical variable. The study included 165 patients (age 58.1 ± 14.2 years, 55% male, 33% diabetic). Peritoneal glucose exposure increased over time (coefficient 1.49, 95% confidence interval [CI] 1.07 – 1.92) and was not significantly associated with change in D:PCr 4h (coefficient 0.00004, 95% CI -0.0001 – 0.0002, p = 0.68). Similar results were found when peritoneal glucose exposure was examined as a baseline continuous or categorical variable. A significant 2-way interaction was observed with PD solution type, whereby a progressive increase in D:PCr 4h was seen in the patients receiving conventional PD solution, but not in those receiving biocompatible solution. Increases in peritoneal solute transport rate in PD patients over time were not associated with peritoneal glucose exposure, although a strong and positive association with PD solution glucose degradation product content was identified. 
Peritoneal glucose exposure may be a less important consideration than peritoneal glucose degradation product exposure with respect to peritoneal membrane function over time.
Publisher: Wiley
Date: 02-2005
DOI: 10.1111/J.1440-1797.2005.00363.X
Abstract: Interleukin (IL)-1beta, a pro-inflammatory macrophage-derived cytokine, is implicated as a key mediator of interstitial fibrosis and tubular loss or injury in progressive renal insufficiency. This study investigates some of the mechanisms of action of IL-1beta on the proximal tubule. Confluent cultures of primary human proximal tubule cells (PTC) were incubated in serum-free media supplemented with IL-1beta (0-4 ng/mL), phorbol-12-myristate 13-acetate (PMA, protein kinase C activator) (6.25-100 nmol/L), or vehicle (control), together with a non-specific protein kinase C inhibitor (H7), a specific protein kinase C inhibitor (BIM-1), an anti-oxidant (NAC) or a NADPH oxidase inhibitor (AEBSF). Interleukin-1beta-treated PTC exhibited time-dependent increases in fibronectin secretion (ELISA), cell injury (LDH release) and reactive nitrogen species (RNS) release (Griess assay). Proximal tubule cell DNA synthesis (thymidine incorporation) was also significantly suppressed. The effects of IL-1beta, which were reproduced by incubation of PTC with PMA (6.25-100 nmol/L), were blocked by H7 but not by BIM-1. The anti-oxidant NAC (4 mmol/L) partially blocked IL-1beta-induced fibronectin secretion by PTC, but did not affect IL-1beta-induced LDH release, RNS release or growth inhibition. The NADPH oxidase inhibitor AEBSF significantly attenuated all observed deleterious effects of IL-1beta on PTC. Interleukin-1beta directly induces proximal tubule injury, extracellular matrix production and impaired growth. The anti-oxidant NAC appears to ameliorate part of the fibrogenic effect of IL-1beta on PTC through mechanisms that do not significantly involve protein kinase C activation or nitric oxide release.
Publisher: Elsevier BV
Date: 08-2018
DOI: 10.1053/J.AJKD.2017.10.019
Abstract: Concern regarding technique failure is a major barrier to increased uptake of peritoneal dialysis (PD), and the first year of therapy is a particularly vulnerable time. This cohort study used competing-risk regression analyses to identify the key risk factors and risk periods for early transfer to hemodialysis therapy or death in incident PD patients, including all adult patients who initiated PD therapy in Australia and New Zealand in 2000 through 2014. Predictors were patient demographics and comorbid conditions, duration of prior renal replacement therapy, timing of referral, PD modality, dialysis era, and center size. The outcome was technique failure within the first year, defined as transfer to hemodialysis therapy for more than 30 days, or death. Of 16,748 patients included in the study, 4,389 developed early technique failure. Factors associated with increased risk included age older than 70 years, diabetes or vascular disease, prior renal replacement therapy, late referral to a nephrology service, and management in a smaller center. Asian or other race and use of continuous ambulatory PD were associated with reduced risk, as was initiation of PD therapy in 2010 through 2014. Although the risk for technique failure due to death or infection was constant during the first year, mechanical and other causes accounted for a greater number of cases within the initial 9 months of treatment. Limitations included potential for residual confounding due to limited data for residual kidney function, dialysis prescription, and socioeconomic factors. Several modifiable and nonmodifiable factors are associated with early technique failure in PD. Targeted interventions should be considered in high-risk patients to avoid the consequences of an unplanned transfer to hemodialysis therapy or death.
Publisher: SAGE Publications
Date: 03-2007
DOI: 10.1177/089686080702700216
Abstract: The aim of this study was to investigate the factors affecting recovery and durability of dialysis-independent renal function following commencement of peritoneal dialysis (PD). Retrospective, observational cohort study of the Australian and New Zealand PD patient population. Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry. The study reviewed all patients in Australia and New Zealand who commenced PD for treatment of end-stage renal failure between 15 May 1963 and 31 December 2004. The primary outcomes examined were recovery of dialysis-independent renal function and time from PD commencement to recovery of renal function. A secondary outcome measure was time to renal death (patient death or recommencement of renal replacement therapy) following recovery of dialysis-independent renal function. 24663 patients commenced PD during the study period. Of these, 253 (1%) recovered dialysis-independent renal function. An increased likelihood of recovery was predicted by autoimmune renal disease, hemolytic-uremic syndrome, paraproteinemia, cortical necrosis, renovascular disease, and treatment in New Zealand. A reduced likelihood of recovery was associated with polycystic kidney disease and indigenous race. Analysis of a contemporary subset of 14743 patients in whom complete data were available for body mass index, smoking, and comorbidities yielded comparable results, except that increasing age was additionally associated with a decreased likelihood of recovery. Of the 253 patients who recovered renal function, 151 (60%) recommenced renal replacement therapy and 49 (19%) died within a median period of 226 days (interquartile range 110 – 581 days). The only significant predictors of continued renal survival after renal recovery were autoimmune renal disease and cortical necrosis. Recovery of renal function in patients treated with PD is rare and determined mainly by renal disease type and race. In the majority of cases, recovery is short term. 
The apparently high rate of early patient death or return to dialysis after recovery of renal function on PD raises questions about the appropriateness of discontinuing PD therapy under such circumstances.
Publisher: Wiley
Date: 10-2005
Publisher: Elsevier BV
Date: 09-2020
Publisher: S. Karger AG
Date: 1996
DOI: 10.1159/000174050
Abstract: Multifrequency bio-electrical impedance analysis (MFBIA) was evaluated as a technique for monitoring changes in extracellular and total body water (ECW and TBW, respectively) of 15 subjects during dialysis. Dilution analysis, using deuterium oxide and sodium bromide, was also performed on each subject before dialysis so that prediction equations for ECW and TBW based on the MFBIA measures could be developed. These prediction equations were then used to estimate water compartment volume changes during dialysis and compared with volumetric measures of the dialysate removed. The results show that MFBIA does not accurately measure ECW and TBW changes during dialysis. The MFBIA measures tend to overestimate the changes and are not sufficiently precise to be clinically useful.
Publisher: Elsevier BV
Date: 06-2018
DOI: 10.1053/J.AJKD.2017.10.017
Abstract: Peritonitis is a common cause of technique failure in peritoneal dialysis (PD). Dialysis center-level characteristics may influence PD peritonitis outcomes independent of patient-level characteristics. Retrospective cohort study. Using Australia and New Zealand Dialysis and Transplant Registry (ANZDATA) data, all incident Australian PD patients who had peritonitis from 2004 through 2014 were included. Patient- (including demographic data, causal organisms, and comorbid conditions) and center- (including center size, proportion of patients treated with PD, and summary measures related to type, cause, and outcome of peritonitis episodes) level predictors. The primary outcome was cure of peritonitis with antibiotics. Secondary outcomes were peritonitis-related catheter removal, hemodialysis therapy transfer, peritonitis relapse/recurrence, hospitalization, and mortality. Outcomes were analyzed using multilevel mixed logistic regression. The study included 9,100 episodes of peritonitis among 4,428 patients across 51 centers. Cure with antibiotics was achieved in 6,285 (69%) peritonitis episodes and varied between 38% and 86% across centers. Centers with higher proportions of dialysis patients treated with PD (>29%) had significantly higher odds of peritonitis cure (adjusted OR, 1.21; 95% CI, 1.04-1.40) and lower odds of catheter removal (OR, 0.78; 95% CI, 0.62-0.97), hemodialysis therapy transfer (OR, 0.78; 95% CI, 0.62-0.97), and peritonitis relapse/recurrence (OR, 0.68; 95% CI, 0.48-0.98). Centers with higher proportions of peritonitis episodes receiving empirical antibiotics covering both Gram-positive and Gram-negative organisms had higher odds of cure with antibiotics (OR, 1.22; 95% CI, 1.06-1.42). Patient-level characteristics associated with higher odds of cure were younger age and less virulent causative organisms (coagulase-negative staphylococci, streptococci, and culture negative). 
The variation in odds of cure across centers was 9% higher after adjustment for patient-level characteristics, but 66% lower after adjustment for center-level characteristics. Retrospective study design using registry data. These results suggest that center effects contribute substantially to the appreciable variation in PD peritonitis outcomes that exist across PD centers within Australia.
Publisher: SAGE Publications
Date: 09-2016
Publisher: SAGE Publications
Date: 03-2015
Abstract: There is limited available evidence regarding the role of monitoring serum vancomycin concentrations during treatment of peritoneal dialysis (PD)-associated peritonitis. A total of 150 PD patients experiencing 256 episodes of either gram-positive or culture-negative peritonitis were included to investigate the relationship between measured serum vancomycin within the first week and clinical outcomes of cure, relapse, repeat or recurrence of peritonitis, catheter removal, temporary or permanent transfer to hemodialysis, hospitalization and death. Vancomycin was used as an initial empiric antibiotic in 54 gram-positive or culture-negative peritonitis episodes among 34 patients. The median number of serum vancomycin level measurements in the first week was 3 (interquartile range [IQR] 1-4). The mean day-2 vancomycin level, measured in 34 (63%) episodes, was 17.5 ± 5.2 mg/L. Hospitalized patients were more likely to have serum vancomycin levels measured on day 2 and ≥ 3 measurements in the first week. The peritonitis cure rates were similar between patients with < 3 and ≥ 3 measurements in the first week (77% vs 57%, p = 0.12) and whether or not day-2 vancomycin levels were measured (68% vs 65%, p = 0.84). The average day-2 (18.0 ± 5.9 vs 16.6 ± 3.2, p = 0.5), first-week average (18.6 ± 5.1 vs 18.6 ± 4.3, p = 0.9) and nadir (14.5 ± 4.1 vs 13.6 ± 4.2, p = 0.5) vancomycin levels were comparable in patients who did or did not achieve peritonitis cure. Similar results were observed for all other clinical outcomes. The clinical outcomes of gram-positive and culture-negative peritonitis episodes are not associated with either the frequency or levels of serum vancomycin measurements in the first week of treatment when vancomycin is dosed according to International Society for Peritoneal Dialysis (ISPD) Guidelines.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 09-2002
DOI: 10.1097/00007890-200209150-00015
Abstract: Although obesity has been associated with improved survival on dialysis, its effects on renal transplant outcomes remain unclear. Previous studies have reported conflicting findings and have been limited by the use of outdated patient data, univariate analyses, and liberal transplant selection criteria. The present study aimed to evaluate the effect of obesity on renal transplant outcomes in a rigorously screened population. A retrospective analysis was undertaken of all patients transplanted at the Princess Alexandra Hospital from 1 April 1994 to 31 March 2000. Patients were rigorously screened for cardiovascular disease before acceptance for transplantation. The effects of obesity on renal transplant outcomes were assessed by logistic and multivariate Cox regressions. Of the 493 patients transplanted, 59 (12%) were obese (body mass index [BMI] ≥ 30 kg/m²). Obese patients were more likely to experience superficial wound breakdown (14% vs. 4%, P<0.01) and complete wound dehiscence (3% vs. 0%, P<0.01). Wound infections also tended to be more frequent in obese recipients (15% vs. 8%, P=0.11). There were no significant differences between the two groups with respect to operative duration, postoperative complications, hospitalization, delayed graft function, or acute rejection episodes. Five-year actuarial survival rates were comparable between the two groups with respect to graft survival (83% vs. 84%, P=NS) and patient survival (91% vs. 91%, P=NS). On multivariate analysis, BMI was an independent risk factor for wound breakdown (odds ratio 1.21, 95% CI 1.09-1.34, P<0.001), but not for other posttransplant complications, hospitalization, graft loss, or patient survival. The only significant adverse effect of obesity on renal transplant outcomes was an increase in wound complications, which were generally of minor consequence. 
Provided that adequate care is taken to avoid transplanting patients with significant cardiovascular disease, obese recipients can achieve excellent long-term patient and graft survival on par with their nonobese counterparts. Denying patients access to renal transplantation on the basis of obesity per se does not appear to be justified.
Publisher: SAGE Publications
Date: 2016
Abstract: Peritoneal dialysis (PD) patients develop progressive and cumulative peritoneal injury with longer time spent on PD. The present study aimed to: a) describe the trend of peritoneal injury biomarkers, matrix metalloproteinase-2 (MMP-2) and tissue inhibitor of metalloproteinase-1 (TIMP-1), in incident PD patients; b) explore the capacity of dialysate MMP-2 to predict peritoneal solute transport rate (PSTR) and peritonitis; and c) evaluate the influence of neutral pH, low glucose degradation product (GDP) PD solution on these outcomes. The study included 178 participants from the balANZ trial who had at least 1 stored dialysate sample. Changes in PSTR and peritonitis were primary outcome measures, and the utility of MMP-2 in predicting these outcomes was analyzed using multilevel linear regression and multilevel Poisson regression, respectively. Significant linear increases in dialysate MMP-2 and TIMP-1 concentrations were observed (p < 0.001), but neither was affected by the type of PD solutions received (MMP-2: p = 0.07; TIMP-1: p = 0.63). An increase in PSTR from baseline was associated with higher levels of MMP-2 (p = 0.02), and the use of standard solutions over longer PD duration (p = 0.001). The risk of peritonitis was independently predicted by higher dialysate MMP-2 levels (incidence rate ratio [IRR] per ng/mL, 1.01; 95% confidence interval [CI], 1.005-1.02; p = 0.002) and use of standard solutions (biocompatible solution: IRR 0.45; 95% CI, 0.24-0.85; p = 0.01). Dialysate MMP-2 and TIMP-1 concentrations increased with longer PD duration. Higher MMP-2 levels were associated with faster PSTR and future peritonitis risk. Administration of biocompatible solutions exerted no significant effect on dialysate levels of MMP-2 or TIMP-1, but did counteract the increase in PSTR and the risk of peritonitis associated with the use of standard PD solutions. 
This is the first longitudinal study to examine the clinical utility of MMP-2 as a predictor of patient-level outcomes.
Publisher: Bentham Science Publishers Ltd.
Date: 25-01-2016
DOI: 10.2174/1574887110666151026123235
Abstract: As a consequence of both traditional and non-traditional risk factors, cardiovascular disease is over-represented, and the leading cause of mortality, among patients with Chronic Kidney Disease (CKD). Whilst recommendations for reducing cardiovascular risk in the general population exist, their applicability to the CKD population is questionable due to the exclusion of CKD patients from the majority of contemporary cardiovascular interventional studies. The aim of this review is to critically evaluate the literature regarding pharmacologic cardiovascular interventions in patients with CKD, with an emphasis on studies published since our 2008 review. Interventions discussed include erythropoiesis-stimulating agents (TREAT, U.S. Normal Hematocrit, CHOIR, CREATE, Palmer meta-analysis); statins (SHARP, AURORA, PPP, 4D, ALERT); fibrates (VA-HIT); folic acid (ASFAST, US FOLIC acid trial, HOST); antihypertensive agents, including angiotensin-converting enzyme inhibitors, angiotensin-receptor blockers, beta-blockers and combination therapy (Cice et al, FOSDIAL, Agarwal et al, ONTARGET); sevelamer (DCOR); cinacalcet (ADVANCE, EVOLVE, Cunningham meta-analysis); anti-oxidants (SPACE, HOPE, ATIC); aspirin (HOT study re-analysis); vitamin D analogues (PRIMO); and multidisciplinary intervention (LANDMARK). Unfortunately, there remains a paucity of evidence in this area and a large number of methodologically poor quality studies with negative results. It is possible that these interventions do not have the same positive effect in CKD patients due to differences in the pathogenesis driving cardiovascular disease burden, such as altered bone metabolism and calcific vascular disease. Further well-designed studies with appropriately selected study populations and patient level outcomes are required. Until such time, physicians must consider on an individual patient basis the appropriateness of these interventions.
Publisher: Wiley
Date: 06-02-2021
DOI: 10.1111/CTR.14235
Publisher: SAGE Publications
Date: 2015
Abstract: The utility of local and systemic interleukin 6 (IL-6) as a prognostic marker in incident peritoneal dialysis (PD) patients remains to be fully defined. The present study aimed to explore the capacity of systemic IL-6 concentrations to predict cardiovascular events (CVEs) and mortality in PD patients, and to evaluate the influence of neutral-pH PD solutions low in glucose degradation products (GDPs) on systemic IL-6. The study included 175 incident participants from the balANZ trial with at least one stored serum sample. A composite CVE score was used as the primary clinical outcome measure. Multilevel linear regression and Poisson regression models were fitted to describe, respectively, the trend of serum IL-6 over time and its ability to predict composite CVE. A significant increase in serum IL-6 from baseline to 24 months was observed in the study population (mean difference: 1.68 pg/mL; p = 0.006). The type of PD solution received by patients exerted no significant effect on serum IL-6 (p = 0.12). Composite CVE was significantly and independently associated with baseline serum IL-6 (incidence rate ratio per pg/mL: 1.06; 95% confidence interval: 1.02 to 1.10; p = 0.003). Baseline serum IL-6 was a significant independent predictor of composite CVE. Serum IL-6 concentrations increased with increasing PD duration and were not significantly modified with the use of biocompatible fluid over the study period. The present study is the first to link systemic IL-6 concentrations with CVE outcomes in incident PD patients.
Publisher: Informa UK Limited
Date: 2008
DOI: 10.1080/00365510802144953
Abstract: The reporting of an eGFR (estimated glomerular filtration rate) with every requested serum creatinine concentration measurement has been successfully introduced as routine practice in Australia and New Zealand. This change in laboratory practice has been linked with a major educational initiative in the diagnosis and management of chronic kidney disease as well as standardization of a range of laboratory measurement and reporting issues. The process has been collaborative between renal physicians, chemical pathologists and laboratory scientists and their respective professional bodies, and the relevant decisions have been made collectively on the best available evidence. The initial guidelines were released in August 2005 and these have been followed up in 2007 with further recommendations to address issues arising since that time.
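The automated eGFR reporting described here later converged on the CKD-EPI formula (per the working group's position statement elsewhere in this profile). As a concrete illustration, below is a sketch of the 2009 CKD-EPI creatinine equation with serum creatinine in mg/dL; the function name and test values are illustrative, and the race coefficient shown belongs to the original 2009 formula (it was dropped in the 2021 refit).

```python
# 2009 CKD-EPI creatinine equation (illustrative sketch).
# eGFR = 141 * min(Scr/k, 1)^a * max(Scr/k, 1)^-1.209
#        * 0.993^age * 1.018 [female] * 1.159 [black, 2009 version]
def ckd_epi_2009(scr_mg_dl, age_years, female, black=False):
    kappa = 0.7 if female else 0.9     # sex-specific creatinine knot
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = (141
            * min(ratio, 1.0) ** alpha
            * max(ratio, 1.0) ** -1.209
            * 0.993 ** age_years)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr  # mL/min/1.73 m^2

print(round(ckd_epi_2009(0.8, 60, female=True), 1))
```

In practice, laboratories reporting under these recommendations round the result and report a precise figure up to the agreed upper reporting threshold rather than the raw floating-point value.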
Publisher: AMPCo
Date: 2000
Publisher: American Medical Association (AMA)
Date: 09-05-2017
Publisher: Springer Science and Business Media LLC
Date: 17-05-2001
Abstract: It is well established that ACE-inhibitors should be avoided in patients with renal artery stenosis. In recent years it has also been recommended that caution be exercised when angiotensin II blockers are used in the same type of patient, but the evidence is based on only a few cases. We describe a case in which use of the angiotensin II antagonist candesartan (Atacand) induced renal failure in a patient with bilateral renal artery stenosis. The course of the case is illustrated by results from sequential renography, selective renal vein catheterisation for measurement of renin, and angiographic findings. In patients with renal artery stenosis the angiotensin II antagonist candesartan should be avoided.
Publisher: Wiley
Date: 08-12-2010
Publisher: SAGE Publications
Date: 15-02-2022
DOI: 10.1177/08968608221078903
Abstract: Although caregivers allow peritoneal dialysis (PD) patients with disabilities the opportunity to perform PD, it is crucial to clarify the safety and effectiveness of assisted PD performed by caregivers compared to self-PD. PD patients from 22 PD centres in Thailand were prospectively followed in the Peritoneal Dialysis Outcomes and Practice Patterns Study during 2016-2017. Patients receiving assisted PD performed by caregivers were matched 1:1 with self-PD patients using propensity scores calculated by logistic regression. The associations between assisted PD and risk of mortality, peritonitis and permanent transfer to haemodialysis (HD) were assessed by multivariable competing risk regression. Of 778 eligible patients, 447 (57%) required assisted PD performed by caregivers. Most of the caregivers were family members (98%), while the rest were non-family paid caregivers (2%). Patient factors associated with assisted PD were older age, female gender, lower educational level, cardiovascular comorbidities, diabetes, automated PD modality, poorer functional status and lower blood chemistries (albumin, creatinine, sodium, potassium and phosphate). After 1:1 matching, the baseline characteristics were adequately matched, and 269 patients in each group were analysed. Compared with self-PD, assisted PD was significantly associated with an increased risk of all-cause mortality (adjusted sub-hazard ratio: 2.15, 95% confidence interval: 1.24-3.74). There were no differences in the occurrences of peritonitis and permanent HD transfer between the groups. Assisted PD was required by more than half of Thai PD patients and was independently associated with a higher mortality risk. This may reflect a causal effect or confounding by indication.
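The matching design above (propensity scores from logistic regression, then 1:1 pairing) can be sketched in miniature. This is a hedged illustration on synthetic data, not the PDOPPS analysis: the covariates, coefficients, caliper and greedy nearest-neighbour rule are all assumptions made for the example.

```python
# Sketch of 1:1 propensity-score matching on synthetic data.
import math
import random

def fit_logistic(X, y, lr=0.5, epochs=300):
    """Plain batch gradient-descent logistic regression; w[0] is the bias."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = 1 / (1 + math.exp(-z)) - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / len(X) for wj, g in zip(w, grad)]
    return w

def propensity(w, xi):
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1 / (1 + math.exp(-z))

def match_pairs(scores, treated, caliper=0.05):
    """Greedy nearest-neighbour 1:1 matching within a caliper."""
    pool = {i for i, t in enumerate(treated) if t == 0}
    pairs = []
    for i, t in enumerate(treated):
        if t == 1 and pool:
            j = min(pool, key=lambda k: abs(scores[k] - scores[i]))
            if abs(scores[j] - scores[i]) <= caliper:
                pairs.append((i, j))
                pool.remove(j)
    return pairs

random.seed(1)
# two synthetic covariates: standardised age and a diabetes flag;
# older, diabetic subjects are more likely to be "assisted"
X = [[random.gauss(0, 1), float(random.random() < 0.4)] for _ in range(150)]
treated = [int(random.random() < 1 / (1 + math.exp(-(0.8 * age + 0.7 * dm))))
           for age, dm in X]
w = fit_logistic(X, treated)
scores = [propensity(w, xi) for xi in X]
pairs = match_pairs(scores, treated)
print(f"{sum(treated)} treated, {len(pairs)} matched pairs")
```

Matching on the score rather than on every covariate is what makes the design scale: each treated subject needs only one close control in a single dimension.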
Publisher: SAGE Publications
Date: 09-2016
Abstract: The aim of the present study was to evaluate the predictors of transfer to home hemodialysis (HHD) after peritoneal dialysis (PD) completion. All Australian and New Zealand patients treated with PD on day 90 after initiation of renal replacement therapy between 2000 and 2012 were included. Completion of PD was defined by death, transplantation, or hemodialysis (HD) for 180 days or more. Patients were categorized as “transferred to HHD” if they initiated HHD fewer than 180 days after PD had ended. Multivariable logistic regression was used to evaluate predictors of transfer to HHD in a restricted cohort experiencing PD technique failure; a competing-risks analysis was used in the unrestricted cohort. Of 10 710 incident PD patients, 3752 died, 1549 underwent transplantation, and 2915 transferred to HD, among whom 156 (5.4%) started HHD. The positive predictors of transfer to HHD in the restricted cohort were male sex [odds ratio (OR): 2.81], obesity (OR: 2.20), and PD therapy duration (OR: 1.10 per year). Negative predictors included age (OR: 0.95 per year), infectious cause of technique failure (OR: 0.48), underweight (OR: 0.50), kidney disease resulting from hypertension (OR: 0.38) or diabetes (OR: 0.32), and Maori (OR: 0.65) or Aboriginal and Torres Strait Islander (OR: 0.30) race. Comparable results were obtained with a competing-risks model. Transfer to HHD after completion of PD is rare and predicted by patient characteristics at baseline and at the time of PD end. Transition to HHD should be considered more often in patients using PD, especially when they fulfill the identified characteristics.
Publisher: Elsevier BV
Date: 11-2020
Publisher: Springer Science and Business Media LLC
Date: 14-04-2021
DOI: 10.1186/S13063-021-05200-0
Abstract: The unprecedented demand placed on healthcare systems from the COVID-19 pandemic has forced a reassessment of clinical trial conduct and feasibility. Consequently, the Australasian Kidney Trials Network (AKTN), an established collaborative research group known for conducting investigator-initiated global clinical trials, had to efficiently respond and adapt to the changing landscape during COVID-19. Key priorities included ensuring patient and staff safety, trial integrity and network sustainability for the kidney care community. New resources have been developed to enable a structured review and contingency plan of trial activities during the pandemic and beyond.
Publisher: Elsevier BV
Date: 09-2020
Publisher: Springer Science and Business Media LLC
Date: 27-09-2018
Publisher: BMJ
Date: 17-07-2018
DOI: 10.1136/ARCHDISCHILD-2018-314934
Abstract: The aim was to compare quality of life (QoL) among children and adolescents with different stages of chronic kidney disease (CKD) and determine factors associated with changes in QoL. Cross-sectional. The Kids with CKD study involved five of eight paediatric nephrology units in Australia and New Zealand. There were 375 children and adolescents (aged 6–18 years) with CKD, on dialysis or transplanted, recruited between 2013 and 2016. Overall and domain-specific QoL were measured using the Health Utilities Index 3 score, with a scale from −0.36 (worse than dead) to 1 (perfect health). QoL scores were compared between CKD stages using the Mann-Whitney U test. Factors associated with changes in QoL were assessed using multivariable linear and ordinal logistic regression. QoL for those with CKD stages 1–2 (n=106, median 0.88, IQR 0.63–0.96) was higher than those on dialysis (n=43, median 0.67, IQR 0.39–0.91, p < 0.001), and similar to those with kidney transplants (n=135, median 0.83, IQR 0.59–0.97, p=0.4) or CKD stages 3–5 (n=91, median 0.85, IQR 0.60–0.98). Reductions were most frequent in the domains of cognition (50%), pain (42%) and emotion (40%). The risk factors associated with decrements in overall QoL were being on dialysis (decrement of 0.13, 95% CI 0.02 to 0.25, p=0.02), lower family income (decrement of 0.10, 95% CI 0.03 to 0.15, p=0.002) and short stature (decrement of 0.09, 95% CI 0.01 to 0.16, p=0.02). The overall QoL and domains such as pain and emotion are substantially worse in children on dialysis compared with earlier stage CKD and those with kidney transplants.
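The between-group QoL comparisons above use the Mann-Whitney U test. As a minimal sketch of what that statistic actually computes (toy scores only, not the study data), the U statistic with mid-ranks for ties is:

```python
# Mann-Whitney U statistic with mid-ranks for tied values (toy sketch).
def mann_whitney_u(a, b):
    """Return U for sample `a` versus sample `b`."""
    pooled = sorted(a + b)
    # assign the average (mid) rank to each run of tied values
    rank_of = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        rank_of[pooled[i]] = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        i = j
    rank_sum = sum(rank_of[v] for v in a)
    return rank_sum - len(a) * (len(a) + 1) / 2

# perfectly separated groups give the extreme values 0 and n1*n2
print(mann_whitney_u([1, 2, 3], [4, 5, 6]),
      mann_whitney_u([4, 5, 6], [1, 2, 3]))
```

Because U depends only on ranks, it is well suited to skewed, bounded utility scores such as the Health Utilities Index, where medians and IQRs (as reported above) are the natural summaries.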
Publisher: Wiley
Date: 02-05-2020
DOI: 10.1111/NEP.13574
Abstract: The use of haemodiafiltration (HDF) for the management of patients with end-stage kidney failure is increasing worldwide. Factors associated with HDF use have not been studied and may vary in different countries and jurisdictions. The aim of this study was to document the pattern of increase and variability in uptake of HDF in Australia and New Zealand, and to describe patient- and centre-related factors associated with its use. Using the Australian and New Zealand Dialysis and Transplant Registry, all incident patients commencing haemodialysis (HD) between 2000 and 2014 were included. The primary outcome was HDF commencement over time, which was evaluated using multivariable logistic regression stratified by country. Of 27 433 patients starting HD, 3339 (14.4%) of 23 194 patients in Australia and 810 (19.1%) of 4239 in New Zealand received HDF. HDF uptake increased over time in both countries but was more rapid in New Zealand than Australia. In Australia, HDF use was more likely in males (odds ratio (OR) 1.13, 95% confidence interval (CI) = 1.03-1.24, P = 0.009) and less likely with older age (reference 70 years: OR = 0.48, 95% CI = 0.41-0.56) and higher body mass index (reference BMI < 18.5 kg/m²). Haemodiafiltration uptake is increasing, variable and associated with both patient and centre characteristics. Centre characteristics not explicitly captured elsewhere explained 36% of variability in HDF uptake in Australia and 48% in New Zealand.
Publisher: Springer Science and Business Media LLC
Date: 24-10-2010
DOI: 10.1007/S10495-009-0414-Y
Abstract: One of the impeding factors in the effective treatment of metastatic renal cell carcinoma (RCC) is their intrinsic and acquired resistance to chemotherapeutics. Many studies have shown that drug resistance, at least in part, is mediated by the upregulation of anti-apoptotic (Bcl-2) and multidrug resistance molecules (MDR-1 and MRP-1) by the transcription factor nuclear factor kappa B (NF-kappaB). Combining NF-kappaB inhibitors with conventional chemotherapeutics could overcome resistance of cancer cells. In this study, we examined the synergistic effect of pyrrolidine dithiocarbamate (PDTC), a NF-kappaB inhibitor, and cisplatin, on two human metastatic RCC cell lines ACHN and SN12K1. Individual non-toxic concentrations of PDTC and cisplatin, when combined, synergistically induced a significant increase in apoptosis of the two RCC cell lines. In ACHN cells, the groups with nuclear translocation of NF-kappaB showed resistance to apoptosis, but in SN12K1 cells, the groups with NF-kappaB translocation were susceptible to apoptosis. The combination treatment significantly decreased the transcription activity of all NF-kappaB subunits in both cell lines. Anti-apoptotic proteins Bcl-2 and Bcl-(XL) were significantly decreased in the combination therapy group of both cell lines, but MDR-1 was decreased only in the ACHN cells. No changes in MRP-1 were observed in any of the treatment groups. The results demonstrate the potential of PDTC to be an adjunct therapeutic agent. The major mechanism of the synergistic effect appears to be mediated by the inhibition of transcription activity of NF-kappaB rather than its expression, and the resultant decrease in the anti-apoptotic proteins Bcl-2 and Bcl-(XL).
Publisher: Springer Science and Business Media LLC
Date: 16-09-2022
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 2022
DOI: 10.2215/CJN.07800621
Abstract: Nutrition intervention is an essential component of kidney disease management. This study aimed to understand current global availability and capacity of kidney nutrition care services, interdisciplinary communication, and availability of oral nutrition supplements. The International Society of Renal Nutrition and Metabolism (ISRNM), working in partnership with the International Society of Nephrology (ISN) Global Kidney Health Atlas Committee, developed this Global Kidney Nutrition Care Atlas. An electronic survey was administered among key kidney care stakeholders through 182 ISN-affiliated countries between July and September 2018. Overall, 160 of 182 countries (88%) responded, of which 155 countries (97%) answered the survey items related to kidney nutrition care. Only 48% of the 155 countries have dietitians/renal dietitians to provide this specialized service. Dietary counseling, provided by a person trained in nutrition, was generally not available in 65% of low-/lower middle–income countries and “never” available in 23% of low-income countries. Forty-one percent of the countries did not provide formal assessment of nutrition status for kidney nutrition care. The availability of oral nutrition supplements varied globally and, mostly, were not freely available in low-/lower middle–income countries for both inpatient and outpatient settings. Dietitians and nephrologists only communicated “sometimes” on kidney nutrition care in ≥60% of countries globally. This survey reveals significant gaps in global kidney nutrition care service capacity, availability, cost coverage, and deficiencies in interdisciplinary communication on kidney nutrition care delivery, especially in lower-income countries.
Publisher: Oxford University Press (OUP)
Date: 28-01-2016
DOI: 10.1093/NDT/GFU403
Abstract: Patient training has widely been considered to be one of the most critical factors for achieving optimal peritoneal dialysis (PD) clinical outcomes, including avoidance of peritonitis. However, research in this important area has been remarkably scant to date. This article will critically review the clinical evidence underpinning PD patient training and will specifically focus on four key areas: who should provide training, and how, when and where it should be performed to obtain the best results. Evidence gaps and future research directions will also be discussed.
Publisher: John Wiley & Sons, Ltd
Date: 16-12-2015
Publisher: Hindawi Limited
Date: 2017
DOI: 10.1155/2017/7570352
Abstract: Objectives. To compare the effectiveness of real acupressure versus sham acupressure therapy in improving sleep quality in patients receiving hemodialysis (HD) or hemodiafiltration (HDF). Methods. A multicenter, single-blind, randomized controlled trial was conducted in two Australian dialysis units located in Princess Alexandra Hospital and Logan Hospital, respectively. Forty-two subjects with self-reported poor sleep quality were randomly assigned to real (n = 21) or sham (n = 21) acupressure therapy delivered thrice weekly for four consecutive weeks during routine dialysis sessions. The primary outcome was the Pittsburgh Sleep Quality Index (PSQI) score measured at week four adjusted for baseline PSQI measurements. Secondary outcomes were quality of life (QOL) (SF-8), adverse events, and patient acceptability (treatment acceptability questionnaire, TAQ). Results. The two groups were comparable on global PSQI scores (difference 0.19, 95% confidence interval [CI] −1.32 to 1.70) and on the subscale scores. Similar results were observed for QOL both in the mental (difference −3.88, 95% CI −8.63 to 0.87) and the physical scores (difference 2.45, 95% CI −1.69 to 6.58). There were no treatment-related adverse events and acupressure was perceived favorably by participants. Conclusion. Acupressure is a safe, well-tolerated, and highly acceptable therapy in adult hemodialysis patients in a Western healthcare setting with uncertain implications for therapeutic efficacy.
Publisher: Springer Science and Business Media LLC
Date: 25-08-2020
DOI: 10.1186/S12882-020-01978-4
Abstract: Reliable estimates of the absolute and relative risks of postoperative complications in kidney transplant recipients undergoing elective surgery are needed to inform clinical practice. This systematic review and meta-analysis aimed to estimate the odds of both fatal and non-fatal postoperative outcomes in kidney transplant recipients following elective surgery compared to non-transplanted patients. Systematic searches were performed through Embase and MEDLINE databases to identify relevant studies from inception to January 2020. Risk of bias was assessed by the Newcastle Ottawa Scale and quality of evidence was summarised in accordance with GRADE methodology (grading of recommendations, assessment, development and evaluation). Random effects meta-analysis was performed to derive summary risk estimates of outcomes. Meta-regression and sensitivity analyses were performed to explore heterogeneity. Fourteen studies involving 14,427 kidney transplant patients were eligible for inclusion. Kidney transplant recipients had increased odds of postoperative mortality following cardiac surgery (OR 2.2, 95% CI 1.9–2.5) and general surgery (OR 2.2, 95% CI 1.3–4.0) compared to non-transplanted patients. The magnitude of the mortality odds was increased in the presence of diabetes mellitus. Acute kidney injury was the most frequently reported non-fatal complication whereby kidney transplant recipients had increased odds compared to their non-transplanted counterparts. The odds of acute kidney injury were highest following orthopaedic surgery (OR 15.3, 95% CI 3.9–59.4). However, there was no difference in the odds of stroke and pneumonia. Kidney transplant recipients are at increased odds for postoperative mortality and acute kidney injury following elective surgery. This review also highlights the urgent need for further studies to better inform perioperative risk assessment to assist in planning perioperative care.
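The random-effects pooling above can be illustrated with the DerSimonian-Laird estimator, one standard choice for this kind of meta-analysis (the abstract does not specify which random-effects estimator was used). The study effect sizes and variances below are hypothetical, not the review's data.

```python
# DerSimonian-Laird random-effects pooling of log odds ratios.
import math

def dersimonian_laird(log_ors, variances):
    """Return (pooled log-OR, its SE, between-study variance tau^2)."""
    w = [1 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, log_ors)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_ors))
    df = len(log_ors) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # moment estimator
    w_re = [1 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, log_ors)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return pooled, se, tau2

# three hypothetical studies reporting ORs of 1.2, 2.0 and 3.5
log_ors = [math.log(1.2), math.log(2.0), math.log(3.5)]
variances = [0.02, 0.02, 0.02]
pooled, se, tau2 = dersimonian_laird(log_ors, variances)
print(round(math.exp(pooled), 2),
      (round(math.exp(pooled - 1.96 * se), 2),
       round(math.exp(pooled + 1.96 * se), 2)))
```

When the studies disagree more than their within-study variances allow, tau² becomes positive and the pooled confidence interval widens accordingly, which is the behaviour a fixed-effect model cannot show.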
Publisher: Wiley
Date: 19-01-2007
Publisher: Wiley
Date: 2004
DOI: 10.1111/J.1444-0903.2004.T01-6-.X
Abstract: Early renal insufficiency (ERI), defined as a calculated or measured glomerular filtration rate (GFR) between 30 and 60 mL/min per 1.73 m2, is present in more than 10% of the adult Australian population. This pernicious condition is frequently unrecognised, progressive and accompanied by multiple associated comorbidities, including hypertension, renal osteodystrophy, anaemia, sleep apnoea, cardiovascular disease, hyperparathyroidism and malnutrition. Several treatments have been suggested to retard GFR decline in ERI, including blood pressure reduction, angiotensin-converting enzyme inhibition, angiotensin receptor antagonism, calcium channel blockade, cholesterol reduction, smoking cessation, erythropoietin therapy, dietary protein restriction, intensive glycaemic control and early intensive multidisciplinary patient education within a renal unit. In addition, specific interventions have been reported to be renoprotective in atherosclerotic renal artery stenosis, diabetic nephropathy, lupus nephritis and certain forms of primary glomerulonephritis. The present paper reviews the available published randomised controlled clinical trials and meta-analyses supporting (or refuting) a role for each of these therapeutic manoeuvres. (Intern Med J 34: 50–57)
Publisher: Elsevier BV
Date: 12-2008
DOI: 10.1080/00313020802436402
Abstract: Caveolin-1 (cav1) is reported to have both cell survival and pro-apoptotic characteristics. This may be explained by its localisation or phosphorylation in injured cells. This study investigated the role of cav1 in kidney cells of different nephron origin and developmental state after oxidative stress. Renal MDCK distal tubular, HK2 proximal tubular epithelial cells and HEK293T renal embryonic cells were treated with 1 mM hydrogen peroxide. Apoptosis, loss of cell adhesion, and cell survival were compared with expression of cav1 in its non-phosphorylated and phosphorylated (p-cav1) forms. Cav1 was transfected into the HEK293T cells, or caveolae were disrupted with filipin or nystatin in HK2 cells, to investigate functions of cav1 and p-cav1. Oxidative stress induced more apoptosis in HK2s than MDCKs (p < 0.05). HK2s had lower endogenous cav1 and p-cav1 than MDCKs (p < 0.05). Both cell lines had increased p-cav1, but not cav1, with oxidative stress. This increase was greatest in MDCKs (p < 0.01). Cav1 was located mainly in the plasma membrane of untreated cells and translocated to the cytoplasm with oxidative stress in both cell lines, more so in MDCKs. Disruption of caveolae caused cytoplasmic translocation of cav1 in HK2s, but did not alter high levels of oxidative stress-induced apoptosis. When HEK293Ts lacking endogenous cav1 were transfected with cav1, oxidant-induced apoptosis and loss of cell adhesion were decreased (p < 0.01), and p-cav1 was induced by treatment. Cav1 expression and localisation in kidney cells is not anti-apoptotic, but increased expression of p-cav1 may promote cell survival after oxidative stress.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 2014
DOI: 10.2215/CJN.02310213
Abstract: The Initiating Dialysis Early and Late study showed that planned early or late initiation of dialysis, based on the Cockcroft and Gault estimation of GFR, was associated with identical clinical outcomes. This study examined the association of all-cause mortality with estimated GFR at dialysis commencement, which was determined using multiple formulas. Initiating Dialysis Early and Late trial participants were stratified into tertiles according to the estimated GFR measured by Cockcroft and Gault, Modification of Diet in Renal Disease, or Chronic Kidney Disease-Epidemiology Collaboration formula at dialysis commencement. Patient survival was determined using multivariable Cox proportional hazards regression. Only Initiating Dialysis Early and Late trial participants who commenced on dialysis were included in this study (n = 768). A total of 275 patients died during the study. After adjustment for age, sex, racial origin, body mass index, diabetes, and cardiovascular disease, no significant differences in survival were observed between estimated GFR tertiles determined by Cockcroft and Gault (lowest tertile adjusted hazard ratio, 1.11; 95% confidence interval, 0.82 to 1.49; middle tertile hazard ratio, 1.29; 95% confidence interval, 0.96 to 1.74; highest tertile reference), Modification of Diet in Renal Disease (lowest tertile hazard ratio, 0.88; 95% confidence interval, 0.63 to 1.24; middle tertile hazard ratio, 1.20; 95% confidence interval, 0.90 to 1.61; highest tertile reference), and Chronic Kidney Disease-Epidemiology Collaboration equations (lowest tertile hazard ratio, 0.93; 95% confidence interval, 0.67 to 1.27; middle tertile hazard ratio, 1.15; 95% confidence interval, 0.86 to 1.54; highest tertile reference). Estimated GFR at dialysis commencement was not significantly associated with patient survival, regardless of the formula used. However, a clinically important association cannot be excluded, because observed confidence intervals were wide.
Publisher: Wiley
Date: 03-12-2010
DOI: 10.1002/PTR.3038
Abstract: ACE inhibitors (ACEi) reduce renal tubulointerstitial fibrosis but are not completely effective. Combined extract of Astragalus membranaceus and Angelica sinensis (A&A) is a traditional antifibrotic agent in China. The present investigation aimed to determine whether an ACEi (Enalapril) and A&A together have a better antifibrotic effect in unilateral ureteral obstruction (UUO) than monotherapy with either agent. Male Sprague-Dawley rats (N = 4 per group) had either sham operation or UUO alone, with A&A (combined aqueous and ethanol extract equivalent to 2.1 g dried herbs), with Enalapril (in drinking water at 200 mg/mL) or with both treatments. Kidney and liver were collected for protein extraction or fixed for histologic stains, immunohistochemistry (IHC) and microscopy. Enalapril or A&A individually were antifibrotic. Transforming growth factor-beta1, fibroblast activation, collagen deposition, macrophage accumulation and tubular cell apoptosis were all decreased. The combination of the two drugs was significantly more effective than Enalapril alone in reducing tumor necrosis factor-alpha, collagen accumulation, activation of fibroblasts, and tubular cell apoptosis. In conclusion, Enalapril with A&A significantly decreased tubulointerstitial fibrosis to a greater extent than treatment with Enalapril alone. Further studies focusing on the isolation of the active constituents of A&A and the clinical application of the combination of ACEi plus A&A are warranted to determine the value of this treatment in humans.
Publisher: Wiley
Date: 02-05-2020
DOI: 10.1111/NEP.13570
Abstract: Involving consumers (patients, carers and family members) across all stages of research is gaining momentum in the nephrology community. Scientific meetings present a partnership opportunity with consumers for dissemination of research findings. The Better Evidence and Translation in Chronic Kidney Disease (BEAT-CKD) research collaboration, in partnership with Kidney Health Australia, convened two consumer sessions at the 54th Australian and New Zealand Society of Nephrology Annual Scientific Meeting held in September 2018. The educational objectives, topics and session formats were informed by members of the Better Evidence and Translation-Chronic Kidney Disease Consumer Advisory Board (which at the time comprised 36 consumers from around Australia with varied experience of kidney disease). Patients, health professionals and researchers facilitated and presented at the sessions. In-person and live-streaming attendance options were available, with over 400 total participants across the two sessions. Sessions were also video recorded for dissemination and later viewing. Evaluations demonstrated consumers found the presentations informative, relevant and accessible. Attendees indicated strong interest in participating in similar sessions at future scientific meetings. We propose a framework for partnering with consumers as organisers, facilitators, speakers and attendees at scientific meetings in nephrology.
Publisher: SAGE Publications
Date: 10-05-2022
DOI: 10.1177/08968608221096560
Abstract: Life participation is an outcome of critical importance to patients receiving peritoneal dialysis (PD). However, there is no widely accepted or validated tool for measuring life participation in patients receiving PD. Online consensus workshop to identify the essential characteristics of life participation as a core outcome, with the goal of establishing a patient-reported outcome measure for use in all trials in patients receiving PD. Thematic analysis of transcripts was performed. Fifty-six participants, including 17 patients and caregivers, from 15 countries convened via online videoconference. Four themes were identified: reconfiguring expectations of daily living (accepting day-to-day fluctuation as the norm, shifting thresholds of acceptability, preserving gains in flexibility and freedom), ensuring broad applicability and interpretability (establishing cross-cultural relevance, incorporating valued activities, distinguishing unmodifiable barriers to life participation), capturing transitions between modalities and how they affect life participation (responsive to trajectory towards stable, reflecting changes with dialysis transitions) and maximising feasibility of implementation (reducing completion burden, administrable with ease and flexibility). There is a need for a validated, generalisable outcome measure for life participation in patients receiving PD. Feasibility, including length of time to complete and flexible mode of delivery, are important to allow implementation in all trials that include patients receiving PD.
Publisher: Elsevier BV
Date: 2021
Publisher: Elsevier BV
Date: 06-2009
DOI: 10.1053/J.AJKD.2008.12.037
Abstract: Evidence-based clinical practice guidelines have been a major development in nephrology internationally, but it is uncertain how the nephrology community regards these guidelines. This study aimed to determine the views of nephrologists on the content and effects of their local guidelines (Caring for Australasians with Renal Impairment [CARI]). In 2006, a self-administered survey was distributed to all Australian and New Zealand nephrologists. Seven questions were repeated from a similar survey in 2002. A total of 211 nephrologists (70% of practicing nephrologists) responded. More than 90% agreed that the CARI guidelines were a useful summary of evidence, and nearly 60% reported that the guidelines had significantly influenced their practice. The proportion of nephrologists reporting that the guidelines had improved patient outcomes increased from 14% in 2002 to 38% in 2006. The proportion of nephrologists indicating that the guidelines did not match the best available evidence decreased from 30% in 2002 to 8% in 2006. Older age and male sex showed some associations with a less favorable response for some domains. The CARI approach of rigorous evidence-based guidelines has been shown to be a successful model of guideline production. Almost all nephrologists regarded the CARI guidelines as useful evidence summaries, although only one-third believed that the guidelines affected health outcomes. Attitudes to the guidelines have become more favorable over time; this may reflect changes in the CARI process or attitudinal changes to evidence among nephrologists. Evaluation by the end user is fundamental to ensuring the applicability of guidelines in clinical practice in the future.
Publisher: Elsevier BV
Date: 05-2021
Publisher: Springer Science and Business Media LLC
Date: 28-07-2009
Abstract: The main hypothesis of this study is that oral heme iron polypeptide (HIP; Proferrin® ES) administration will more effectively augment iron stores in erythropoietic stimulatory agent (ESA)-treated peritoneal dialysis (PD) patients than conventional oral iron supplementation (Ferrogradumet®). Inclusion criteria are peritoneal dialysis patients treated with darbepoietin alpha (DPO; Aranesp®, Amgen) for ≥ 1 month. Patients will be randomized 1:1 to receive either slow-release ferrous sulphate (1 tablet twice daily; control) or HIP (1 tablet twice daily) for a period of 6 months. The study will follow an open-label design but outcome assessors will be blinded to study treatment. During the 6-month study period, haemoglobin levels will be measured monthly and iron studies (including transferrin saturation [TSAT] measurements) will be performed bi-monthly. The primary outcome measure will be the difference in TSAT levels between the 2 groups at the end of the 6-month study period, adjusted for baseline values using analysis of covariance (ANCOVA). Secondary outcome measures will include serum ferritin concentration, haemoglobin level, DPO dosage, Key's index (DPO dosage divided by haemoglobin concentration), and occurrence of adverse events (especially gastrointestinal adverse events). This investigator-initiated multicentre study has been designed to provide evidence to help nephrologists and their peritoneal dialysis patients determine whether HIP administration more effectively augments iron stores in ESA-treated PD patients than conventional oral iron supplementation. Australia New Zealand Clinical Trials Registry number: ACTRN12609000432213.
Publisher: Elsevier BV
Date: 05-2021
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 11-2019
Publisher: Elsevier BV
Date: 08-2024
Publisher: SAGE Publications
Date: 11-2016
Abstract: Although technique failure is a key outcome in peritoneal dialysis (PD), there is currently no agreement on a uniform definition. We explored different definitions of PD technique failure using data from the Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry. We included 16,612 incident PD patients in Australia and New Zealand from January 1998 to December 2012. Different definitions of technique failure were applied according to the minimum number of days (30, 60, 90, 180, or 365) the patient received hemodialysis after cessation of PD. Median technique survival varied from 2.0 years with the 30-day definition to 2.4 years with the 365-day definition. For all definitions, the most common causes of technique failure were death, followed by infectious complications. The likelihood of a patient returning to PD within 12 months of technique failure was highest in the 30-day definition (24%), and was very small when using the 180- and 365-day definitions (3% and 0.8%, respectively). Patients whose technique failed due to mechanical reasons were the most likely to return to PD (46% within 12 months using the 30-day definition). Both 30- and 180-day definitions have clinical relevance but offer different perspectives with very different prognostic implications for further PD. Therefore, we propose that PD technique failure be defined by a composite endpoint of death or transfer to hemodialysis using both 30-day and 180-day definitions.
Publisher: Elsevier BV
Date: 05-2021
Publisher: Wiley
Date: 10-2004
DOI: 10.1111/J.1440-1797.2004.00310.X
Abstract: In recent years, an increasing percentage of people from industrialized countries have been using complementary and alternative medicines (CAM). This, combined with numerous warnings regarding the potential toxicity of these therapies, suggests the need for practitioners to keep abreast of the reported incidence of renal toxicity caused by the ingestion of medicinal herbs. The goal of the present two-part series, on the toxic or beneficial effects of medicinal herbs on renal health, is to provide practitioners with a summary of the most recent information as well as the means by which evidence for benefit or toxicity has been found. In this first article, we explore in vivo evidence of toxicity. Included are nephrotoxicity from aristolochic acid and other components within herbs, herb–drug interactions resulting in adverse renal effects, and renal toxicity from contaminants within the extracts. The review aims to provide a guide to encourage future toxicity studies and rigorous clinical trials.
Publisher: Springer Science and Business Media LLC
Date: 14-10-2009
DOI: 10.1007/S10456-009-9158-0
Abstract: Renal cell carcinomas (RCC) are a heterogeneous group of cancers that often include angiogenesis as a key clinical and pathological hallmark. As the transcription factor nuclear factor kappa B (NF-kappaB) has been identified as a key promoter of angiogenesis, NF-kappaB inhibitors could serve as potential anti-angiogenic agents. In this study, we tested the anti-angiogenic properties of pyrrolidine dithiocarbamate (PDTC), a NF-kappaB inhibitor, using established in vitro and ex vivo assays in human umbilical vein endothelial cells (HUVEC) and human metastatic RCC cell lines (ACHN and SN12K1). In vitro, PDTC inhibited proliferation, capillary tube formation, invasion and trans-differentiation of HUVEC. Ex vivo, PDTC blocked vessel sprouting from aortic explants and disrupted the integrity of pre-formed vessels. PDTC also inhibited the adhesion of HUVEC and RCC cells to substratum and inhibited their invasion. PDTC inhibited RCC-induced proliferation of HUVEC. Protein microarray demonstrated heterogenic actions in each cell line: in HUVEC, epidermal growth factor was significantly decreased; in ACHN, basic fibroblast growth factor, growth-related oncogene, interleukin-6, RANTES and monocyte chemoattractant protein-1 (MCP-1) were significantly decreased; and in SN12K1, MCP-1 was upregulated. PDTC increased the expression of the pro-angiogenic protein interleukin-8 in both RCC cell lines. Expression of vascular endothelial growth factor (VEGF) was not significantly altered, and exogenous VEGF had a neutral effect on in vitro angiogenesis of HUVEC. Collectively, these data suggest that PDTC has some anti-angiogenic properties, which may limit development and progression of RCC.
Publisher: Springer Science and Business Media LLC
Date: 20-08-2009
Abstract: Catheter-related bacteraemias (CRBs) contribute significantly to morbidity, mortality and health care costs in dialysis populations. Despite international guidelines recommending avoidance of catheters for haemodialysis access, hospital admissions for CRBs have doubled in the last decade. The primary aim of the study is to determine whether weekly instillation of 70% ethanol prevents CRBs compared with standard heparin saline. The study will follow a prospective, open-label, randomized controlled design. Inclusion criteria are adult patients with incident or prevalent tunneled intravenous dialysis catheters on three times weekly haemodialysis, with no current evidence of catheter infection and no personal, cultural or religious objection to ethanol use, who are on adequate contraception and are able to give informed consent. Patients will be randomized 1:1 to receive 3 mL of intravenous-grade 70% ethanol into each lumen of the catheter once a week and standard heparin locks for other dialysis days, or to receive heparin locks only. The primary outcome measure will be time to the first episode of CRB, which will be defined using standard objective criteria. Secondary outcomes will include adverse reactions, incidence of CRB caused by different pathogens, time to infection-related catheter removal, time to exit site infections and costs. Prospective power calculations indicate that the study will have 80% statistical power to detect a clinically significant increase in median infection-free survival from 200 days to 400 days if 56 patients are recruited into each arm. This investigator-initiated study has been designed to provide evidence to help nephrologists reduce the incidence of CRBs in haemodialysis patients with tunnelled intravenous catheters. Australian New Zealand Clinical Trials Registry Number: ACTRN12609000493246
Publisher: Elsevier BV
Date: 05-2021
Publisher: Elsevier BV
Date: 11-2019
DOI: 10.1053/J.AJKD.2019.05.011
Abstract: Compared to combination therapy, intraperitoneal (IP) cefepime monotherapy for continuous ambulatory peritoneal dialysis (CAPD)-associated peritonitis may provide potential benefits in lowering staff burden, shortening time-consuming antibiotic preparation, and reducing bag contamination risk. This study sought to evaluate whether cefepime monotherapy is noninferior to combination regimens. Multicenter, open-label, noninferiority, randomized, controlled trial. Adult incident peritoneal dialysis (PD) patients with CAPD-associated peritonitis in 8 PD centers in Thailand. Random assignment to either IP monotherapy of cefepime, 1 g/d, or IP combination of cefazolin and ceftazidime, 1 g/d, both given as continuous dosing. Primary end point: resolution of peritonitis at day 10 (primary treatment response). Secondary end points: initial response (day 5), complete cure (relapse/recurrence-free response 28 days after treatment completion), relapsing/recurrent peritonitis, and death from any cause. Noninferiority would be confirmed for the primary outcome if the lower margin of the 1-sided 95% CI was not less than -10% for the difference in the primary response rate. A 2-sided 90% CI was used to demonstrate the upper or lower border of the 1-sided 95% CI. There were 144 eligible patients with CAPD-associated peritonitis, of whom 70 and 74 patients were in the monotherapy and combination-therapy groups, respectively. Baseline demographic and clinical characteristics were not different between the groups. The primary response was 82.6% in the monotherapy group and 81.1% in the combination-therapy group (treatment difference, 1.5%; 90% CI, -9.1% to 12.1%; P=0.04). There was no significant difference in the monotherapy group compared with the combination-therapy group in terms of initial response rate (65.7% vs 60.8%; treatment difference, 4.9%; 95% CI, -10.8% to 20.6%; P=0.5) and complete cure rate (80.0% vs 80.6%; treatment difference, -0.6%; 95% CI, -13.9% to 12.8%; P=0.7).
Relapsing and recurrent peritonitis occurred in 4.6% and 4.6% of the monotherapy group and 4.2% and 5.6% of the combination-therapy group (P=0.9 and P=0.8, respectively). There was nominally higher all-cause mortality in the monotherapy group (7.1% vs 2.7%; treatment difference, 4.4%; 95% CI, -2.6% to 11.5%), but this difference was not statistically significant (P = 0.2). The study was not double blind. IP cefepime monotherapy was noninferior to conventional combination therapy for resolution of CAPD-associated peritonitis at day 10 and may be a reasonable alternative first-line treatment. This study is supported by The Kidney Foundation of Thailand (R5879), Thailand; Rachadaphiseksompotch Fund (RA56/006) and Rachadaphiseksompotch Endorsement Fund (CU-GRS_61_06_30_01), Chulalongkorn University, Thailand; National Research Council of Thailand (156/2560), Thailand; and Thailand Research Foundation (IRG5780017), Thailand. Registered at ClinicalTrials.gov with study number NCT02872038.
Publisher: Elsevier BV
Date: 04-2004
Publisher: Elsevier BV
Date: 05-2021
Publisher: Elsevier BV
Date: 05-2021
Publisher: Wiley
Date: 16-03-2007
DOI: 10.1111/J.1440-1797.2007.00779.X
Abstract: Proximal tubule cells (PTC) are the major cell type in the cortical tubulointerstitium. Because PTC play a central role in tubulointerstitial pathophysiology, it is essential to prepare pure PTC from kidney tissue to explore the mechanisms of tubulointerstitial pathology. The authors have successfully refined and characterized primary cultures of human PTC using Percoll density gradient centrifugation as a key PTC enrichment step. The cells obtained by this method retain morphological and functional properties of PTC and are minimally contaminated by other renal cells. In particular, the primary isolates have characteristics of epithelial cells with uniform polarized morphology, tight junction and well-formed apical microvilli. Cytokeratin is uniformly and strongly expressed in the isolates. Brush border enzyme activities and PTC transport properties are retained in the isolates. This method therefore provides an excellent in vitro model for the physiologic study of the human proximal tubule.
Publisher: Elsevier BV
Date: 05-2021
Publisher: AMPCo
Date: 02-2009
DOI: 10.5694/J.1326-5377.2009.TB02349.X
Abstract: Estimated glomerular filtration rate (eGFR) using the Modification of Diet in Renal Disease formula has been shown to provide unbiased and acceptably accurate estimates of measured GFR across a broad range of individuals with impaired kidney function. eGFR is superior to measuring serum creatinine (SCr) concentration alone, more accurate than other prediction formulas (such as Cockcroft-Gault) in the setting of reduced kidney function, and more practical and reliable under most circumstances than measuring urinary creatinine clearance. Routine eGFR reporting with requests for SCr, in concert with clinician education, has been shown to enhance the detection of chronic kidney disease (CKD), resulting in improved cardiac and renal outcomes for patients. eGFR has been shown to effectively identify individuals at increased risk of adverse drug reactions (even when SCr concentration is in the normal range). For most drugs prescribed in primary care and for most patients of average age and body size, drug dosage adjustments based on eGFR should be similar to those based on Cockcroft-Gault. eGFR should not replace Cockcroft-Gault for determining dosage adjustments for critical-dose drugs that have a narrow therapeutic index. eGFR has resulted in important spin-off benefits, such as standardisation of laboratory creatinine assays and enhanced public and clinician awareness of CKD. Clinicians should be aware of the strengths, weaknesses and appropriate use of eGFR. Considerable research effort is being directed towards further refinement of eGFR.
Publisher: Wiley
Date: 26-08-2013
DOI: 10.1111/NEP.12119
Abstract: In response to the increase in Chronic Kidney Disease (CKD) worldwide, several professional organizations have developed clinical practice guidelines to manage and prevent its progression. This study aims to compare the scope, content and consistency of published guidelines on CKD stages I-III. Electronic databases of the medical literature, guideline organizations, and the websites of nephrology societies were searched to November 2011. The Appraisal of Guidelines for Research and Evaluation (AGREE) II instrument and textual synthesis were used to appraise and compare recommendations. One consensus statement and 15 guidelines were identified and included. Methodological rigour across guidelines was variable, with average domain scores ranging from 24% to 95%. For detection of CKD, all guidelines recommended estimated glomerular filtration rate measurement; some also recommended serum creatinine and dipstick urinalysis. The recommended protein and albumin creatinine ratios and proteinuria definition thresholds varied (>150-300 mg/day to >500 mg/day). Blood pressure targets ranged from <125/75 to <140/90 mmHg. Angiotensin-converting enzyme inhibitors and angiotensin receptor blockers were recommended for hypertension, in combination or as monotherapy. Protein intake recommendations varied (no restriction, or 0.75 g/kg per day to 1.0 g/kg per day). Salt intake of 6 g/day was recommended by most. Psychosocial support and education were recommended by few, but specific strategies were absent. CKD guidelines were consistent in scope but were variable with respect to their recommendations, coverage and methodological quality. To promote effective primary and secondary prevention of CKD, regularly updated guidelines that are based on the best available evidence and augmented with healthcare context-specific strategies for implementation are warranted.
Publisher: Oxford University Press (OUP)
Date: 07-02-2020
DOI: 10.1093/NDT/GFAA002
Abstract: Home-based dialysis therapies, home hemodialysis (HHD) and peritoneal dialysis (PD) are underutilized in many countries and significant variation in the uptake of home dialysis exists across dialysis centers. This study aimed to evaluate the patient- and center-level characteristics associated with uptake of home dialysis. The Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry was used to include incident dialysis patients in Australia and New Zealand from 1997 to 2017. Uptake of home dialysis was defined as any HHD or PD treatment reported to ANZDATA within 6 months of dialysis initiation. Characteristics associated with home dialysis uptake were evaluated using mixed effects logistic regression models with patient- and center-level covariates, era as a fixed effect and dialysis center as a random effect. Overall, 54 773 patients were included. Uptake of home-based dialysis was reported in 24 399 (45%) patients but varied between 0 and 87% across the 76 centers. Patient-level factors associated with lower uptake included male sex, ethnicity (particularly indigenous peoples), older age, presence of comorbidities, late referral to a nephrology service, remote residence and obesity. Center-level predictors of lower uptake included small center size, smaller proportion of patients with permanent access at dialysis initiation and lower weekly facility hemodialysis hours. The variation in odds of home dialysis uptake across centers increased by 3% after adjusting for the era and patient-level characteristics but decreased by 24% after adjusting for center-level characteristics. Center-specific factors are associated with the variation in uptake of home dialysis across centers in Australia and New Zealand.
Publisher: Springer Science and Business Media LLC
Date: 12-10-2009
Abstract: Adiponectin is a major adipocyte-derived protein with insulin-sensitizing, anti-inflammatory and anti-atherogenic properties. Adiponectin levels correlate inversely with renal function and higher levels are predictive of lower cardiovascular disease (CVD) in patients with normal renal function and chronic kidney disease. No data exist on the association between adiponectin and CVD in renal transplant recipients (RTR). Standard biochemistry, clinical data and adiponectin were collected from 137 RTR recruited to the LANDMARK 2 study at baseline. The LANDMARK 2 study is an ongoing randomized controlled study that compares the outcome of aggressive risk factor modification for cardiovascular disease versus standard post-transplant care in renal transplant recipients with impaired glucose tolerance or diabetes mellitus. Mean patient age was 53.4 ± 12 years and the median post-transplantation period was 5 (0.5-31.9) years. Mean serum adiponectin level was 12.3 ± 7.1 μg/mL. On univariate analysis, adiponectin was positively associated with female gender (P = 0.01) and serum high-density lipoprotein (HDL) concentration (P < 0.001), and inversely with body mass index (P = 0.009), metabolic syndrome (P = 0.047), abnormal glucose tolerance (P = 0.01), C-reactive protein (P = 0.001) and serum triglyceride (P < 0.001). On stepwise multivariate analysis, adiponectin in males was negatively correlated with combined baseline CVD (P = 0.03), waist-hip ratio (P = 0.003) and glomerular filtration rate (P = 0.046), and positively with HDL (P < 0.001). In contrast, in females adiponectin was inversely associated with C-reactive protein (P = 0.001) and serum triglyceride. In conclusion, adiponectin is positively correlated with inflammation, dyslipidemia and abnormal glucose tolerance in RTR. Furthermore, hypoadiponectinemia correlated with increased baseline CVD in male RTR.
Publisher: Elsevier BV
Date: 05-2021
Publisher: Massachusetts Medical Society
Date: 02-04-2009
Publisher: Frontiers Media SA
Date: 2007
DOI: 10.1111/J.1432-2277.2006.00400.X
Abstract: We hypothesized that predictors of outcome in live donor transplants were likely to differ significantly from deceased donor transplants, in which cold ischaemia time, cause of donor death and other donor factors are the most important predictors. The primary aim was to explore the independent predictors of graft function in recipients of live donor kidneys (LDK). Our secondary aim was to determine which donor characteristics are the most useful predictors. A retrospective analysis was undertaken of all patients receiving live donor (n = 206) renal transplants at our institution between 31 May 1994 and 15 October 2002. Twelve patients were excluded from the analysis. Follow-up was completed on all patients until graft loss, death or 22 November 2003. We explored predictors of Nankivell glomerular filtration rate (GFR) at 6 months by multivariate linear regression. In the 194 patients studied, the mean recipient 6-month Nankivell GFR was 59 +/- 15 ml/min/1.73 m(2). Independent predictors of recipient GFR at 6 months were donor Cockcroft-Gault GFR (CrCl: beta 0.16; CI, 0.13 to 0.29; P < 0.0001), steroid-resistant rejection (beta -6.07; CI, -12.05 to -0.09; P = 0.006) and delayed graft function (DGF) (beta -10.0; CI, -19.52 to -0.49; P = 0.039). Renal function in LDK transplant recipients is predicted by donor GFR, episodes of steroid-resistant rejection and DGF. Importantly, donor Cockcroft-Gault GFR is the most important characteristic for predicting recipient renal function.
Publisher: SAGE Publications
Date: 03-2004
DOI: 10.1177/089686080402400209
Abstract: The primary objective of the IDEAL study is to determine whether the timing of dialysis initiation has an effect on survival in subjects with end-stage renal disease (ESRD). The secondary objectives are to determine the impact of “early start” versus “late start” dialysis on nutritional and cardiac morbidity, quality of life, and economic cost. Prospective multicenter randomized controlled trial. Patients are randomized to commence dialysis at a glomerular filtration rate (by Cockcroft–Gault) of either 10–14 mL/min/1.73 m2 (“early start”) or 5–7 mL/min/1.73 m2 (“late start”), with stratification for dialysis modality (hemodialysis vs peritoneal dialysis), study center, and the presence or not of diabetes mellitus. Dialysis units throughout Australia and New Zealand. Patients with ESRD commencing chronic dialysis therapy. At three years from randomization, all-cause mortality, morbidity, economic impact, structural and functional cardiac status, nutritional state, and quality of life will be assessed. To date, 388 patients of a minimum 800 patients have been entered and randomized into the study. Current recruitment rates suggest sufficient patients will be enrolled by December 2004 and follow-up completed by December 2007. The IDEAL study will provide evidence for the optimal time to commence dialysis.
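The Cockcroft–Gault estimate used for randomization in the IDEAL study is a standard published formula. The sketch below is illustrative only: the function name is invented, and the study's normalization to 1.73 m2 of body surface area is not shown.

```python
def cockcroft_gault_crcl(age_years: float, weight_kg: float,
                         scr_mg_dl: float, female: bool) -> float:
    """Cockcroft-Gault creatinine clearance estimate, in mL/min.

    CrCl = (140 - age) * weight / (72 * serum creatinine in mg/dL),
    multiplied by 0.85 for women.
    """
    crcl = (140 - age_years) * weight_kg / (72 * scr_mg_dl)
    return crcl * 0.85 if female else crcl
```

For example, a 40-year-old, 72 kg man with a serum creatinine of 1.0 mg/dL has an estimated clearance of 100 mL/min, well above the trial's early-start band of 10–14 mL/min/1.73 m2.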
Publisher: Oxford University Press (OUP)
Date: 26-07-2020
DOI: 10.1093/NDT/GFAA127
Abstract: While peritoneal dialysis (PD) can offer patients more independence and flexibility compared with in-center hemodialysis, managing the ongoing and technically demanding regimen can impose a burden on patients and caregivers. Patient empowerment can strengthen capacity for self-management and improve treatment outcomes. We aimed to describe patients’ and caregivers’ perspectives on the meaning and role of patient empowerment in PD. Adult patients receiving PD (n = 81) and their caregivers (n = 45), purposively sampled from nine dialysis units in Australia, Hong Kong and the USA, participated in 14 focus groups. Transcripts were thematically analyzed. We identified six themes: lacking clarity for self-management (limited understanding of rationale behind necessary restrictions, muddled by conflicting information); PD regimen restricting flexibility and freedom (burden in budgeting time, confined to be close to home); strength with supportive relationships (gaining reassurance with practical assistance, comforted by considerate health professionals, supported by family and friends); defying constraints (reclaiming the day, undeterred by treatment, refusing to be defined by illness); regaining lost vitality (enabling physical functioning, restoring energy for life participation); and personal growth through adjustment (building resilience and enabling positive outlook, accepting the dialysis regimen). Understanding the rationale behind lifestyle restrictions, practical assistance and family support in managing PD promoted patient empowerment, whereas being constrained in time and capacity for life participation outside the home undermined it. Education, counseling and strategies to minimize the disruption and burden of PD may enhance satisfaction and outcomes in patients requiring PD.
Publisher: Elsevier BV
Date: 12-2021
Publisher: Springer Science and Business Media LLC
Date: 12-1998
Abstract: In order to examine the nature and potential mechanisms of action of extracellular sodium on human proximal tubule growth and transport, quiescent primary cultures of human proximal tubule cells (PTC) were incubated for 24 h in serum-free, growth-factor-free culture media containing low (130 mmol/l), control (140 mmol/l) or high (150 mmol/l) Na+. Compared to control conditions, cells exposed to a high Na+ concentration demonstrated stimulated thymidine incorporation (121.8 +/- 7.6%, P < 0.05) and increased cellular protein content (139.7 +/- 9.9%, P < 0.05), the latter arising from suppressed protein degradation ([3H]valine release 72.3 +/- 2.5%, P 0.1). Substitution of choline chloride for NaCl did not replicate these effects. Conversely, cells incubated in low-Na+ media showed reduced thymidine incorporation (77.2 +/- 4.4%, P < 0.05), reduced protein synthesis (60.6 +/- 4.3%, P < 0.01), reduced protein degradation (79.5 +/- 1.8%, P < 0.01) and an unaltered protein content (102.4 +/- 8.8%). A role for apical Na+/H+ exchange (NHE) activity in mediating Na+-dependent alterations in PTC growth was suggested by the findings of increased apical, ethylisopropylamiloride- (EIPA)-sensitive 22Na+ uptake in the presence of a high Na+ concentration (159 +/- 19% of control, P 0.1) and transforming growth factor-beta1 (1.76 +/- 0.32, 1.73 +/- 0.33 and 1.45 +/- 0.28 ng/mg protein, P > 0.1), and did not exhibit autocrine growth factor activity on separate PTC following adjustment of Na+ concentrations to 140 mmol/l by dialysis. Similarly, low-Na+, control or high-Na+ media did not modify the mitogenic responsiveness of PTC to insulin-like growth factor-I (IGF-I) or alter the affinity or number of PTC IGF-I binding sites. The results confirm that physiological increases in extracellular Na+ concentration directly stimulate human proximal tubule growth and Na+ transport.
Such stimulation does not appear to be mediated by altered PTC secretion of, or responsiveness to, cytokines known to affect tubule growth and transport.
Publisher: Elsevier BV
Date: 07-2002
DOI: 10.1046/J.1523-1755.2002.00401.X
Abstract: Tubulointerstitial lesions, characterized by tubular injury, interstitial fibrosis and the appearance of myofibroblasts, are the strongest predictors of the degree and progression of chronic renal failure. These lesions are typically preceded by macrophage infiltration of the tubulointerstitium, raising the possibility that these inflammatory cells promote progressive renal disease through fibrogenic actions on resident tubulointerstitial cells. The aim of the present study, therefore, was to investigate the potentially fibrogenic mechanisms of interleukin-1beta (IL-1beta), a macrophage-derived pro-inflammatory cytokine, on human proximal tubule cells (PTC). Confluent, quiescent, passage 2 PTC were established in primary culture from histologically normal segments of human renal cortex (N = 11) and then incubated in serum- and hormone-free media supplemented with either IL-1beta (0 to 4 ng/mL) or vehicle (control). IL-1beta significantly enhanced fibronectin secretion by up to fourfold in a time- and concentration-dependent fashion. This was accompanied by significant (2.5- to 6-fold) increases in alpha-smooth muscle actin (alpha-SMA) expression, transforming growth factor beta (TGF-beta1) secretion, nitric oxide (NO) production, NO synthase 2 (NOS2) mRNA and lactate dehydrogenase (LDH) release. Cell proliferation was dose-dependently suppressed by IL-1beta. NG-methyl-l-arginine (L-NMMA, 1 mmol/L), a specific inhibitor of NOS, blocked NO production but did not alter basal or IL-1beta-stimulated fibronectin secretion. In contrast, a pan-specific TGF-beta neutralizing antibody significantly blocked the effects of IL-1beta on PTC fibronectin secretion (IL-1beta 268.1 +/- 30.6% vs. IL-1beta+alphaTGF-beta 157.9 +/- 14.4% of control values, P < 0.001) and DNA synthesis (IL-1beta 81.0 +/- 6.7% vs. IL-1beta+alphaTGF-beta 93.4 +/- 2.1% of control values, P < 0.01).
IL-1beta acts on human PTC to suppress cell proliferation, enhance fibronectin production and promote alpha-smooth muscle actin expression. These actions appear to be mediated by a TGF-beta1 dependent mechanism and are independent of nitric oxide release.
Publisher: Springer Science and Business Media LLC
Date: 06-09-2018
DOI: 10.1038/S41598-018-31612-1
Abstract: Antibiotic resistance is a major global health threat. High prevalences of colonization and infection with multi-drug resistant organisms (MDROs) have been reported in patients undergoing dialysis. It is unknown if this finding extends to patients with mild and moderate/severe kidney disease. An observational study included all adult incident patients hospitalized with a discharge diagnosis of infection in four hospitals from Guangzhou, China. Inclusion criteria: serum creatinine measurement at admission together with microbial culture confirmed infections. Exclusion criterion: undergoing renal replacement therapy. Four categories of Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) estimated glomerular filtration rate (eGFR) were compared: eGFR ≥ 105, 60–104 (reference), 30–59, and < 30 ml/min/1.73 m2. The odds ratios of MDROs, defined as specific pathogens (Staphylococcus aureus, Enterococcus spp., Enterobacteriaceae, Pseudomonas aeruginosa and Acinetobacter spp.) resistant to three or more antibiotic classes, were calculated using a multivariable logistic regression model across eGFR strata. Of 94,445 total microbial culture records, 7,288 first positive cultures matched to infection diagnosis were selected. Among them, 5,028 (68.9%) were potential MDROs. The odds of infection by MDROs were 19% and 41% higher in those with eGFR between 30–59 ml/min/1.73 m2 (adjusted odds ratio (AOR): 1.19, 95% CI: 1.02–1.38, P = 0.022) and eGFR < 30 ml/min/1.73 m2 (AOR: 1.41, 95% CI: 1.12–1.78, P = 0.004), respectively. Patients with impaired renal function have a higher risk of infections by MDROs. Kidney dysfunction at admission may be an indicator for need of closer attention to microbial culture results requiring subsequent change of antibiotics.
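The eGFR strata above are derived from the published 2009 CKD-EPI creatinine equation. A minimal sketch follows; the function and category names are illustrative, and the optional ethnicity coefficient from the original equation is omitted.

```python
def ckd_epi_egfr(scr_mg_dl: float, age_years: float, female: bool) -> float:
    """2009 CKD-EPI creatinine equation; returns eGFR in mL/min/1.73 m^2."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = 141 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age_years
    return egfr * 1.018 if female else egfr

def egfr_stratum(egfr: float) -> str:
    """Map an eGFR value to the four strata compared in the study."""
    if egfr >= 105:
        return ">=105"
    if egfr >= 60:
        return "60-104 (reference)"
    if egfr >= 30:
        return "30-59"
    return "<30"
```

For instance, a 60-year-old woman with serum creatinine 0.7 mg/dL has an eGFR of roughly 94 mL/min/1.73 m2, placing her in the reference stratum.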
Publisher: Wiley
Date: 16-12-2019
DOI: 10.1111/NEP.13221
Abstract: Differences in early graft function between kidney transplant recipients previously managed with either haemodialysis (HD) or peritoneal dialysis are well described. However, only two single-centre studies have compared graft and patient outcomes between extended hour and conventional HD patients, with conflicting results. This study compared the outcomes of all extended hour (≥24 h/week) and conventional HD patients transplanted in Australia and New Zealand between 2000 and 2014. The primary outcome was delayed graft function (DGF), defined in an ordinal manner as either a spontaneous fall in serum creatinine of less than 10% within 24 h, or the need for dialysis within 72 h following transplantation. Secondary outcomes included the requirement for dialysis within 72 h post-transplant, acute rejection, estimated glomerular filtration rate at 12 months, death-censored graft failure, all-cause and cardiovascular mortality, and a composite of graft failure and mortality. A total of 4935 HD patients (378 extended hour HD, 4557 conventional HD) received a kidney transplant during the study period. Extended hour HD was associated with an increased likelihood of DGF compared with conventional HD (adjusted proportional odds ratio 1.33, 95% confidence interval 1.06-1.67). There was no significant difference between extended hour and conventional HD in terms of any of the secondary outcomes. Compared to conventional HD, extended hour HD was associated with DGF, although long-term graft and patient outcomes were not different.
Publisher: Elsevier BV
Date: 02-2009
DOI: 10.1053/J.AJKD.2008.06.032
Abstract: The aim of the present investigation is to compare rates, types, causes, and timing of infectious death in incident peritoneal dialysis (PD) and hemodialysis (HD) patients in Australia and New Zealand. Observational cohort study using the Australian and New Zealand Dialysis and Transplant Registry data. The study included all patients starting dialysis therapy between April 1, 1995, and December 31, 2005. Dialysis modality. Rates of and time to infectious death were compared by using Poisson regression, Kaplan-Meier, and competing risks multivariate Cox proportional hazards model analyses. 21,935 patients started dialysis therapy (first treatment PD, n = 6,020; HD, n = 15,915) during the study period, and 1,163 patients (5.1%) died of infectious causes (PD, 529 patients [7.6%] versus HD, 634 patients [4.2%]). Incidence rates of infectious mortality in PD and HD patients were 2.8 and 1.7/100 patient-years, respectively (incidence rate ratio PD versus HD, 1.66; 95% confidence interval [CI], 1.47 to 1.86). After performing competing risks multivariate Cox analyses allowing for an interaction between time on study and modality because of identified nonproportionality of hazards, PD consistently was associated with increased hazard of death from infection compared with HD after 6 months of treatment (> 6 years: HR, 2.76; 95% CI, 1.76 to 4.33). This increased risk of infectious death in PD patients was largely accounted for by an increased risk of death caused by bacterial or fungal peritonitis. Patients were not randomly assigned to their initial dialysis modality. Residual confounding and coding bias could not be excluded. Dialysis modality selection significantly influences risks, types, causes, and timing of fatal infections experienced by patients with end-stage kidney disease in Australia and New Zealand.
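The crude incidence rate ratio quoted above is simply the ratio of the two modality-specific event rates. A minimal sketch follows; the function names are illustrative, and the registry's adjusted analyses used Poisson regression and competing-risks Cox models rather than this crude ratio.

```python
def incidence_rate_per_100py(events: int, person_years: float) -> float:
    """Crude incidence rate per 100 patient-years of follow-up."""
    return 100.0 * events / person_years

def incidence_rate_ratio(rate_a: float, rate_b: float) -> float:
    """Crude incidence rate ratio of group A relative to group B."""
    return rate_a / rate_b
```

The reported PD and HD rates of 2.8 and 1.7 per 100 patient-years give a crude ratio of about 1.65, consistent (allowing for rounding of the quoted rates) with the reported ratio of 1.66.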
Publisher: Wiley
Date: 25-11-2020
Publisher: Elsevier BV
Date: 06-2010
DOI: 10.1016/J.BIOCEL.2009.11.013
Abstract: The classical action of androgen receptor (AR) is to regulate gene transcriptional processes via AR nuclear translocation, response element binding and recruitment of, or crosstalk with, transcription factors. AR also utilises non-classical, non-genomic mechanisms of signal transduction. These precede gene transcription or protein synthesis, and involve steroid-induced modulation of cytoplasmic or cell membrane-bound regulatory proteins. Despite many decades of investigation, the role of AR in gene regulation of cells and tissues remains only partially characterised. AR exerts most of its effects in sex hormone-dependent tissues of the body, but the receptor is also expressed in many tissues not previously thought to be androgen sensitive. Thus it is likely that a complex, more over-arching, role for AR exists. Each AR domain co-ordinates a multitude of individual and vital roles via a diverse array of interacting partner molecules that are necessary for cellular and tissue development and maintenance. Aberrant AR activity, promoted by mutations or binding partner misregulation, can present as many clinical manifestations including androgen insensitivity syndrome and prostate cancer. In the case of malignant prostate cancer, treatment generally revolves around androgen deprivation therapies designed to interfere with AR action and the androgen signalling axis. Androgen therapies for prostate cancer often fail, highlighting a real need for increased research into AR function.
Publisher: SAGE Publications
Date: 11-11-2020
Abstract: Peritoneal dialysis (PD) can offer patients more autonomy and flexibility compared with in-center hemodialysis (HD). However, burnout – defined as mental, emotional, or physical exhaustion that leads to thoughts of discontinuing PD – is associated with an increased risk of transfer to HD. We aimed to describe the perspectives of burnout among patients on PD and their caregivers. In this focus group study, 81 patients and 45 caregivers participated in 14 focus groups from 9 dialysis units in Australia, Hong Kong, and the United States. Transcripts were analyzed thematically. We identified two themes. Suffering an unrelenting responsibility contributed to burnout, as patients and caregivers felt overwhelmed by the daily regimen, perceived their life to be coming to a halt, tolerated the PD regimen for survival, and had to bear the burden and uncertainty of what to expect from PD alone. Adapting and building resilience against burnout encompassed establishing a new normal, drawing inspiration and support from family, relying on faith and hope for motivation, and finding meaning in other activities. For patients on PD and their caregivers, burnout was intensified by perceiving PD as an unrelenting, isolating responsibility that they had no choice but to endure, even if it held them back from doing other activities in life. More emphasis on developing strategies to adapt and build resilience could prevent or minimize burnout.
Publisher: SAGE Publications
Date: 1996
DOI: 10.1177/089686089601600110
Abstract: The aims of this study were to assess the clinical utility of total and regional bone densitometry in a large continuous ambulatory peritoneal dialysis (CAPD) population and to determine the clinical, biochemical, and radiographic variables that best identified osteopenic CAPD patients. A cross-sectional study was performed on 45 CAPD patients (19 males, 26 females), comprising the total CAPD population at the Princess Alexandra Hospital. Total body (TB), anteroposterior lumbar spine (APL), femoral neck (FN), Ward's triangle (WT), and skull bone mineral densities (BMDs) were measured using dual-energy x-ray absorptiometry (DEXA) and then correlated with clinical, biochemical, and radiographic indices of uremic osteodystrophy. BMDs were not significantly different from age- and sex-matched reference population data. Considerable regional variation of BMD Z scores was noted between FN (-0.11 ± 0.23), WT (-0.11 ± 0.22), and APL (1.22 ± 0.04) (p = 0.003). APL Z scores were significantly reduced in patients with a previous history of fracture (-1.36 ± 1.07 vs 0.89 ± 0.31), bone pain (-0.72 ± 1.08 vs 1.01 ± 0.31), or steroid treatment (-0.62 ± 0.39 vs 1.16 ± 0.35). Increased BMD Z scores for APL (1.82 ± 0.57 vs 0.38 ± 0.29, p < 0.05), FN (0.32 ± 0.36 vs -0.38 ± 0.29, p = 0.014), and WT (0.45 ± 0.38 vs -0.45 ± 0.26, p < 0.05) were found in patients with radiographic hyperparathyroid bone disease. Both APL BMD Z scores and skull BMDs were weakly correlated with PTH (r = -0.33, p < 0.05 and r = -0.33, p < 0.05, respectively) and with CAPD duration (r = 0.30, p < 0.05 and r = -0.30, p < 0.05). Generally, however, total body and regional BMDs were poorly related to age, renal disease type, dialysis duration, renal failure duration, serum aluminum, calcium, phosphate, alkaline phosphatase, osteocalcin, and parathyroid hormone. We conclude that the prevalence of osteopenia is not increased in CAPD patients.
Clinical and biochemical parameters do not reliably predict BMD measurements, but prior steroids and bone symptoms are major risk factors for important bone loss. Although DEXA can reliably detect osteopenia in different skeletal regions, its usefulness in detecting osteodystrophy is limited by the confounding effects of superimposed hyperparathyroid osteosclerosis, which increases BMD.
Publisher: Dustri-Verlag Dr. Karl Feistle
Date: 2020
DOI: 10.5414/CNP92S104
Publisher: BMJ
Date: 02-2019
DOI: 10.1136/BMJOPEN-2018-024382
Abstract: Patients with chronic kidney disease (CKD) are at heightened cardiovascular risk, which has been associated with abnormalities of bone and mineral metabolism. A deeper understanding of these abnormalities should facilitate improved treatment strategies and patient-level outcomes, but at present there are few large, randomised controlled clinical trials to guide management. Positive associations between serum phosphate and fibroblast growth factor 23 (FGF-23) and cardiovascular morbidity and mortality in both the general and CKD populations have resulted in clinical guidelines suggesting that serum phosphate be targeted towards the normal range, although few randomised and placebo-controlled studies have addressed clinical outcomes using interventions to improve phosphate control. Early preventive measures to reduce the development and progression of vascular calcification, left ventricular hypertrophy and arterial stiffness are crucial in patients with CKD. We outline the rationale and protocol for an international, multicentre, randomised parallel-group trial assessing the impact of the non-calcium-based phosphate binder, lanthanum carbonate, compared with placebo on surrogate markers of cardiovascular disease in a predialysis CKD population—the IMpact of Phosphate Reduction On Vascular End-points (IMPROVE-CKD) study. The primary objective of the IMPROVE-CKD study is to determine if the use of lanthanum carbonate reduces the burden of cardiovascular disease in patients with CKD stages 3b and 4 when compared with placebo. The primary end-point of the study is change in arterial compliance measured by pulse wave velocity over a 96-week period. Secondary outcomes include change in aortic calcification and biochemical parameters of serum phosphate, parathyroid hormone and FGF-23 levels.
Ethical approval for the IMPROVE-CKD trial was obtained by each local Institutional Ethics Committee for all 17 participating sites in Australia, New Zealand and Malaysia prior to study commencement. Results of this clinical trial will be published in peer-reviewed journals and presented at conferences. ACTRN12610000650099.
Publisher: Springer Science and Business Media LLC
Date: 30-01-2013
Publisher: Oxford University Press (OUP)
Date: 2005
DOI: 10.1093/NDT/GFH580
Publisher: MDPI AG
Date: 15-12-2021
DOI: 10.3390/NU13124481
Abstract: Synbiotics have emerged as a therapeutic strategy for modulating the gut microbiome and targeting novel cardiovascular risk factors, including the uremic toxins indoxyl sulfate (IS) and p-cresyl sulfate (PCS). This study aims to evaluate the feasibility of a trial of long-term synbiotic supplementation in adults with stage 3–4 chronic kidney disease (CKD). Adult participants with CKD and an estimated glomerular filtration rate (eGFR) of 15–60 mL/min/1.73 m2 were recruited between April 2017 and August 2018 to a feasibility, double-blind, placebo-controlled, randomized trial of synbiotic therapy or matched identical placebo for 12 months. The primary outcomes were recruitment and retention rates as well as acceptability of the intervention. Secondary outcomes were treatment adherence and dietary intake. Exploratory outcomes were evaluation of the cardiovascular structure and function, serum IS and PCS, stool microbiota profile, kidney function, blood pressure, and lipid profile. Of 166 potentially eligible patients, 68 (41%) were recruited into the trial (synbiotic n = 35, placebo n = 33). Synbiotic and placebo groups had acceptable and comparable 12-month retention rates (80% versus 85%, respectively, p = 0.60). Synbiotic supplementation altered the stool microbiome with an enrichment of Bifidobacterium and Blautia spp., resulting in a 3.14 mL/min/1.73 m2 (95% confidence interval (CI), −6.23 to −0.06 mL/min/1.73 m2, p 0.01) reduction in eGFR and a 20.8 µmol/L (95% CI, 2.97 to 38.5 µmol/L, p 0.01) increase in serum creatinine concentration. No between-group differences were observed in any of the other secondary or exploratory outcomes. Long-term synbiotic supplementation was feasible and acceptable to patients with CKD, and it modified the gastrointestinal microbiome. However, the reduction in kidney function with synbiotics warrants further investigation.
Publisher: SAGE Publications
Date: 2020
Abstract: The outcomes of culture-negative peritonitis in peritoneal dialysis (PD) patients have been reported to be superior to those of culture-positive peritonitis. The current study aimed to examine whether this observation also applied to different subtypes of culture-positive peritonitis. This multicentre registry study included all episodes of peritonitis in adult PD patients in Australia between 2004 and 2014. The primary outcome was medical cure. Secondary outcomes were catheter removal, hemodialysis transfer, relapsing/recurrent peritonitis and peritonitis-related death. These outcomes were analyzed using mixed effects logistic regression. Overall, 11,122 episodes of peritonitis occurring in 5367 patients were included. A total of 1760 (16%) episodes were culture-negative, of which 77% were medically cured. Compared with culture-negative peritonitis, the odds of medical cure were lower in peritonitis caused by Staphylococcus aureus (adjusted odds ratio (OR) 0.62, 95% confidence interval (CI) 0.52–0.73), Pseudomonas species (OR 0.20, 95% CI 0.16–0.26), other gram-negative organisms (OR 0.48, 95% CI 0.41–0.56), polymicrobial organisms (OR 0.30, 95% CI 0.25–0.35), fungi (OR 0.02, 95% CI 0.01–0.03), and other organisms (OR 0.61, 95% CI 0.49–0.76), while the odds were similar in other (non-staphylococcal) gram-positive organisms (OR 1.11, 95% CI 0.97–1.28). Similar results were observed for catheter removal and hemodialysis transfer. Compared with culture-negative peritonitis, peritonitis-related mortality was significantly higher in culture-positive peritonitis except that due to other gram-positive organisms. There was no difference in the odds of relapsing/recurrent peritonitis between culture-negative and culture-positive peritonitis. Culture-negative peritonitis had superior outcomes compared to culture-positive peritonitis except for non-staphylococcal gram-positive peritonitis.
Publisher: Oxford University Press (OUP)
Date: 2001
DOI: 10.1093/NDT/16.1.197
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 21-06-2017
DOI: 10.2215/CJN.12321216
Abstract: Technique failure is a major limitation of peritoneal dialysis. Our study aimed to identify center- and patient-level predictors of peritoneal dialysis technique failure. All patients on incident peritoneal dialysis in Australia from 2004 to 2014 were included in the study using data from the Australia and New Zealand Dialysis and Transplant Registry. Center- and patient-level characteristics associated with technique failure were evaluated using Cox shared frailty models. Death-censored technique failure and cause-specific technique failure were analyzed as secondary outcomes. The study included 9362 patients from 51 centers in Australia. The technique failure rate was 0.35 (95% confidence interval, 0.34 to 0.36) episodes per patient-year, with a sevenfold variation across centers that was mainly associated with center-level characteristics. Technique failure was significantly less likely in centers with larger proportions of patients treated with peritoneal dialysis ( %: adjusted hazard ratio, 0.83; 95% confidence interval, 0.73 to 0.94) and more likely in smaller centers ( new patients per year: adjusted hazard ratio, 1.10; 95% confidence interval, 1.00 to 1.21) and centers with lower proportions of patients achieving target baseline serum phosphate levels ( %: adjusted hazard ratio, 1.15; 95% confidence interval, 1.03 to 1.29). Similar results were observed for death-censored technique failure, except that center target phosphate achievement was not significantly associated. Technique failure due to infection, social reasons, mechanical causes, or death was variably associated with center size, proportion of patients on peritoneal dialysis, and/or target phosphate achievement, automated peritoneal dialysis exposure, icodextrin use, and antifungal use. The variation of hazards of technique failure across centers was reduced by 28% after adjusting for patient-specific factors and an additional 53% after adding center-specific factors.
Technique failure varies widely across centers in Australia. A significant proportion of this variation is related to potentially modifiable center characteristics, including peritoneal dialysis center size, proportion of patients on peritoneal dialysis, and proportion of patients on peritoneal dialysis achieving target phosphate level.
Publisher: Elsevier BV
Date: 05-2011
DOI: 10.1016/J.JCHROMB.2010.12.010
Abstract: The further development of derivatizing reagents for plasma amino acid quantification by tandem mass spectrometry is described. The succinimide ester of 4-methylpiperazineacetic acid (MPAS), the iTRAQ reagent, was systematically modified to improve tandem mass spectrometer (MS/MS) product ion intensity. 4-Methylpiperazinebutyryl succinimide (MPBS) and dimethylaminobutyryl succinimide (DMABS) afforded one to two orders of magnitude greater MS/MS product ion signal intensity than the MPAS derivative for simple amino acids. CD(3) analogues of the modified derivatizing reagents were evaluated for preparation of amino acid isotope-labelled quantifying standards. Acceptable accuracy and precision were obtained with d(3)-DMABS as the amino acid standards derivatizing reagent. The product ion spectra of the DMABS amino acid derivatives are diagnostic for structural isomers including valine/norvaline, alanine/sarcosine and leucine/isoleucine. The improved analytical sensitivity and specificity afforded by these derivatives may help to establish liquid chromatography tandem mass spectrometry (LC-MS/MS) with derivatization-generated isotope-labelled standards as a viable alternative to amino acid analysers.
Publisher: Wiley
Date: 12-2008
DOI: 10.1002/J.2055-2335.2008.TB00389.X
Abstract: To review acute renal failure in patients on one or more target drugs – diuretics, non‐steroidal anti‐inflammatory drugs (NSAID), cyclo‐oxygenase‐2 (COX‐2) inhibitors, angiotensin converting enzyme inhibitors (ACEI) and angiotensin receptor antagonists (ARA). Retrospective review of drug charts of patients admitted to hospital with acute renal failure from 2003 to 2005. Patients taking one or more of the target drugs prior to admission were identified. Data were collected on patient demographics, target drugs, other factors contributing to acute renal failure, medical history, baseline and peak creatinine concentration, and clinical outcome. The baseline and lowest estimated glomerular filtration rate (eGFR) were calculated. Of the 289 patients admitted with acute renal failure, 108 (37%) were taking one or more of the target drugs. The mean age was 73 years and 84 patients (79%) had a baseline eGFR of ≤ 60 mL/min. There was a marked decline in kidney function from a mean baseline eGFR of 50 mL/min to 17 mL/min (a 60% decline). 44% of patients were dehydrated, 12% were septic and 60% had more than 4 comorbid diseases. Only 6 patients were on more than 3 target drugs. The decline in eGFR exceeded 50% with each drug/drug combination and, except for ACEI and ARA (withheld in 50% of cases), the other drugs/drug combinations were withheld in most patients. Acute renal failure was reversible after treatment of the underlying illness and withdrawal of the target drugs in the majority of patients. Over a third of patients with acute renal failure were taking diuretics, NSAID/COX‐2 inhibitors or ACEI/ARA, which may have played a causal or contributory role in the decline in their renal function. Other risk factors for developing acute renal failure included dehydration, sepsis, multiple comorbidities and older age.
Publisher: Elsevier BV
Date: 2017
DOI: 10.1016/J.SEMNEPHROL.2016.10.005
Abstract: Long-term exposure to a high glucose concentration in conventional peritoneal dialysis (PD) solution has a number of direct and indirect (via glucose degradation products [GDP]) detrimental effects on the peritoneal membrane, as well as systemic metabolism. Glucose- or GDP-sparing strategies often are hypothesized to confer clinical benefits to PD patients. Icodextrin (glucose polymer) solution improves peritoneal ultrafiltration and reduces the risk of fluid overload, but these beneficial effects are probably the result of better fluid removal rather than being glucose sparing. Although frequently used for glucose sparing, the role of amino acid-based solution in this regard has not been tested thoroughly. When glucose-free solutions are used in a combination regimen, published studies showed that glycemic control was improved significantly in diabetic PD patients, and there probably are beneficial effects on peritoneal function. However, the long-term effects of glucose-free solutions, used either alone or as a combination regimen, require further studies. On the other hand, neutral pH-low GDP fluids have been shown convincingly to preserve residual renal function and urine volume. The cost effectiveness of these solutions supports the regular use of neutral pH-low GDP solutions. Nevertheless, further studies are required to determine whether neutral pH-low GDP solutions exert beneficial effects on patient-level outcomes, such as peritonitis, technique survival, and patient survival.
Publisher: S. Karger AG
Date: 2020
DOI: 10.1159/000505567
Abstract: Background: A new class of dialysis membrane, the mid cut-off (MCO) dialyzer, has been developed to improve the clearance of uremic toxins in hemodialysis (HD). The a tRial Evaluating Mid cut-Off Value membrane clearance of Albumin and Light chains in HemoDialysis patients (REMOVAL-HD) study aimed to determine if regular use of the MCO dialyzer was safe and, specifically, did not result in a significant loss of albumin. Methods: This investigator-initiated, crossover, longitudinal device study was conducted across 9 centers in Australia and New Zealand (n = 89). Participants had a 4-week wash-in with high-flux HD, followed by a 24-week intervention with MCO HD and a subsequent 4-week wash-out with high-flux HD. The primary outcome was change in serum albumin between weeks 4 and 28. Secondary outcomes included trends in serum albumin, changes in kappa- and lambda-free light chains (FLC), 6-min walk test (6MWT), malnutrition inflammation score (MIS), restless legs score and quality of life. Results: Participants had a mean age of 66 ± 14 years, 62% were men, 45% were anuric, and 51% had diabetes. There was no reduction in serum albumin following treatment with MCO HD (mean reduction –0.7 g/L, 95% CI –1.5 to 0.1). A sustained, unexplained reduction in serum albumin (> %) was not observed in any participant. A reduction in FLC was observed 2 weeks into MCO HD (lambda-FLC: Δ –9.1 mg/L, 95% CI –14.4 to –3.7; kappa-FLC: Δ –5.7 mg/L, 95% CI –9.8 to –1.6) and was sustained for the rest of the study intervention. Both FLC increased after the cessation of MCO use. There was no improvement in restless legs symptoms, quality of life, 6MWT or MIS scores. Conclusions: Regular HD using the MCO dialyzer did not result in a significant fall in serum albumin. There were no effects on quality of life, functional status or nutrition.
Trial Registration: Australian New Zealand Clinical Trials Registry Number (ANZCTRN) 12616000804482.
Publisher: Elsevier BV
Date: 02-2012
DOI: 10.1016/J.MEHY.2011.11.014
Abstract: Renal cell carcinoma (RCC) is the commonest of the renal neoplasms. Although surgery and cryoablation are successful curative treatments for localized RCC, most patients are diagnosed with advanced or metastatic RCC, which has a poor prognosis. RCCs are a heterogeneous set of cancers that have traditionally been classified and staged using cellular characteristics, size, local extension and distant metastases. Current staging systems provide good prognostic information, but it is very likely that the identification of new, more accurate and predictive prognostic markers, not currently included in traditional staging systems, will improve the outcome for RCC patients. For this reason, increased knowledge of the underlying molecular characteristics of RCC development and progression is necessary. In most cancers, but especially RCC, deregulated control of apoptosis contributes to cancer growth by aberrantly extending cell viability and facilitating resistance to cancer therapies. Here we present the hypothesis that select members of the tumor necrosis factor (TNF) superfamily, the TNF receptor-associated factors (TRAFs), have a role in RCC apoptosis and may have prognostic significance for RCC. Candidate biomarkers for RCC are few, and the TRAFs may be important inclusions in panels of biomarkers for RCC. TRAFs may also be potential molecular targets for new therapies, either through their ability to promote apoptosis in the cancers themselves, or through their ability to modulate the immune defence against cancer progression. Some supporting data are presented here for our hypothesis. However, these novel concepts need further careful analysis before they can assist clinicians and oncologists in the earlier detection of RCC and in characterizing patients with RCC for individualised targeted therapy.
Publisher: SAGE Publications
Date: 07-2010
Publisher: MDPI AG
Date: 02-11-2020
DOI: 10.3390/NU12113376
Abstract: The excess intake of dietary sodium is a key modifiable factor for reducing disease progression in autosomal dominant polycystic kidney disease (ADPKD). The aim of this study was to test the hypothesis that the scored salt questionnaire (SSQ; a frequency questionnaire of nine sodium-rich food types) is a valid instrument to identify high dietary salt intake in ADPKD. The performance of the SSQ was evaluated in adults with ADPKD with an estimated glomerular filtration rate (eGFR) ≥ 30 mL/min/1.73 m2 during the screening visit of the PREVENT-ADPKD trial. High dietary sodium intake (HSI) was defined by a mean 24-h urinary sodium excretion ≥ 100 mmol/day from two collections. The median 24-h urine sodium excretion was 132 mmol/day (IQR: 112–172 mmol/d) (n = 75; mean age 44.6 ± 11.5 years; 53% female), and HSI (86.7% of total) was associated with male gender and higher BMI and systolic blood pressure (p < 0.05). The SSQ score (73 ± 23, mean ± SD) was weakly correlated with log10 24-h urine sodium excretion (r = 0.29, p = 0.01). Receiver operating characteristic analysis showed that the optimal cut-off point in predicting HSI was an SSQ score of 74 (area under the curve 0.79; sensitivity 61.5%; specificity 90.0%; p < 0.01). The evaluation of the SSQ in participants with a BMI ≥ 25 (n = 46) improved the sensitivity (100%) and the specificity (100%). Consumers with an SSQ score ≥ 74 (n = 41) had higher relative percentage intake of processed meats/seafood and flavourings added to cooking (p < 0.05). In conclusion, the SSQ is a valid tool for identifying high dietary salt intake in ADPKD, but its value proposition (over 24-h urinary sodium measurement) is that it may provide consumers and their healthcare providers with insight into the potential origin of sodium-rich food sources.
Publisher: Springer Science and Business Media LLC
Date: 26-07-2009
Abstract: There has not been a comprehensive, multi-centre study of streptococcal peritonitis in patients on peritoneal dialysis (PD) to date. The predictors, treatment and clinical outcomes of streptococcal peritonitis were examined by binary logistic regression and multilevel, multivariate Poisson regression in all Australian PD patients, involving 66 centres, between 2003 and 2006. Two hundred and eighty-seven episodes of streptococcal peritonitis (4.6% of all peritonitis episodes) occurred in 256 individuals. Its occurrence was independently predicted by Aboriginal or Torres Strait Islander racial origin. Compared with other organisms, streptococcal peritonitis was associated with significantly lower risks of relapse (3% vs 15%), catheter removal (10% vs 23%) and permanent haemodialysis transfer (9% vs 18%), as well as a shorter duration of hospitalisation (5 vs 6 days). Overall, 249 (87%) patients were successfully treated with antibiotics without experiencing relapse, catheter removal or death. The majority of streptococcal peritonitis episodes were treated with either intraperitoneal vancomycin (most common) or first-generation cephalosporins for a median period of 13 days (interquartile range 8–18 days). Initial empiric antibiotic choice did not influence outcomes. Streptococcal peritonitis is a not infrequent complication of PD, and is more common in Indigenous patients. When treated with either first-generation cephalosporins or vancomycin for a period of 2 weeks, streptococcal peritonitis is associated with lower risks of relapse, catheter removal and permanent haemodialysis transfer than other forms of PD-associated peritonitis.
Publisher: Elsevier BV
Date: 09-2016
DOI: 10.1053/J.AJKD.2016.02.037
Abstract: In the context of clinical research, investigators have historically selected the outcomes that they consider to be important, but these are often discordant with patients' priorities. Efforts to define and report patient-centered outcomes are gaining momentum, though little work has been done in nephrology. We aimed to identify patient and caregiver priorities for outcomes in hemodialysis. Nominal group technique. Patients on hemodialysis therapy and their caregivers were purposively sampled from 4 dialysis units in Australia (Sydney and Melbourne) and 7 dialysis units in Canada (Calgary). Identification and ranking of outcomes. Mean rank score (of 10) for top 10 outcomes and thematic analysis. 82 participants (58 patients, 24 caregivers) aged 24 to 87 (mean, 58.4) years in 12 nominal groups identified 68 outcomes. The 10 top-ranked outcomes were fatigue/energy (mean rank score, 4.5), survival (defined by patients as resilience and coping; 3.7), ability to travel (3.6), dialysis-free time (3.3), impact on family (3.2), ability to work (2.5), sleep (2.3), anxiety/stress (2.1), decrease in blood pressure (2.0), and lack of appetite/taste (1.9). Mortality ranked only 14th and was not regarded as the complement of survival. Caregivers ranked mortality, anxiety, and depression higher than patients, whereas patients ranked ability to work higher. Four themes underpinned their rankings: living well, ability to control outcomes, tangible and experiential relevance, and severity and intrusiveness. Only English-speaking participants were eligible. Although trials in hemodialysis have typically focused on outcomes such as death, adverse events, and biological markers, patients tend to prioritize outcomes that are more relevant to their daily living and well-being.
Researchers need to consider interventions that are likely to improve these outcomes and measure and report patient-relevant outcomes in trials, and clinicians may become more patient-orientated by using these outcomes in their clinical encounters.
Publisher: Wiley
Date: 20-11-2014
Publisher: American Physiological Society
Date: 11-2007
DOI: 10.1152/AJPRENAL.00088.2007
Abstract: Despite the abundant expression of protease-activated receptor (PAR)-2 in the kidney, its relevance to renal physiology is not well understood. A role for this receptor in inflammation and cell proliferation has recently been suggested in nonrenal tissues. The aims of this study were to demonstrate that human proximal tubule cells (PTC) express functional PAR-2 and to investigate whether its activation can mediate proinflammatory and proliferative responses in these cells. Primary human PTC were cultured under serum-free conditions with or without the PAR-2-activating peptide SLIGKV-NH2 (up to 800 μM), a control peptide, VKGILS-NH2 (200 μM), or trypsin (0.01–100 nM). PAR-2 expression (RT-PCR), intracellular Ca2+ mobilization (fura-2 fluorimetry), DNA synthesis (thymidine incorporation), fibronectin production (ELISA, Western blotting), and monocyte chemotactic protein (MCP)-1 secretion (ELISA) were measured. Trypsinogen expression in kidney and PTC cultures was determined by immunohistochemistry and Western blotting. In the kidney, PTC were the predominant cell type expressing PAR-2. SLIGKV-NH2, but not VKGILS-NH2, stimulated a rapid concentration-dependent mobilization of intracellular Ca2+ and ERK1/2 phosphorylation and, by 24 h, increases in DNA synthesis, fibronectin secretion, and MCP-1 secretion. These delayed responses appeared to be independent of ERK1/2. Trypsin produced similar rapid but not delayed responses. Trypsinogen was weakly expressed by PTC in the kidney and in culture. In summary, PTC are the main site of PAR-2 expression in the human kidney. In PTC cultures SLIGKV-NH2 initiates proinflammatory and proliferative responses. Trypsinogen expressed within the kidney has the potential to contribute to PAR-2 activation in certain circumstances.
Publisher: Wiley
Date: 15-10-2020
DOI: 10.1111/NEP.13670
Abstract: Infectious complications are common following kidney transplantation and rank in the top five causes of death in patients with allograft function. Over the last 5 years, there has been emerging evidence that changes in the gastrointestinal microbiota following kidney transplantation may play a key role in the pathogenesis of transplant-associated infections. Different factors have emerged which may disrupt the interaction between the gastrointestinal microbiota and the immune system, which may lead to infective complications in kidney transplant recipients. This review will discuss the structure and function of the gastrointestinal microbiota, the changes that occur in the gastrointestinal microbiota following kidney transplantation and the factors underpinning these changes, how these changes may lead to transplant-associated infectious complications and potential treatments which may be instituted to mitigate this risk.
Publisher: Wiley
Date: 05-2019
DOI: 10.1111/NEP.13556
Abstract: Haemodialysis is usually started at a frequency of three times a week, with occasional patients starting twice weekly ('incremental dialysis'). Incremental haemodialysis (HD) may preserve residual kidney function and has been associated with reduced mortality. In the present study, we report the prevalence and outcomes of incremental dialysis in Australia and New Zealand. The cohort comprised all adults starting renal replacement therapy with HD in Australia and New Zealand in 2004-2015. We used Cox proportional hazards modelling with a primary exposure of dialysis frequency at first survey date (≥3 or <3 times per week). The primary outcome was all-cause mortality; secondary outcomes were cardiovascular and non-cardiovascular mortality. Eight hundred and fifty of 27 513 subjects were started on twice-weekly HD (prevalence 3%). Compared with conventional patients, incremental dialysis patients were older (67 vs 62 years, P < 0.001) and had a lower body mass index (26.1 vs 27.7 kg/m2). Incremental dialysis was used infrequently, and there was evidence of patient-level differences. All-cause mortality was similar, but there were differences in cause-specific mortality. Incremental dialysis needs to be tested in prospective trials to define the safety and efficacy of this approach.
Publisher: American Physiological Society
Date: 04-1997
DOI: 10.1152/AJPRENAL.1997.272.4.F484
Abstract: To determine whether insulin-like growth factor I (IGF-I) stimulated apical sodium/hydrogen exchange (NHE), confluent primary human proximal tubule cells (PTC) were incubated for 48 h in serum-free media in the presence or absence of 100 ng/ml IGF-I. Cells incubated in IGF-I demonstrated significant increases in thymidine incorporation (181.2 +/- 30.3% of control values; n = 12, P = 0.01) and in resting intracellular pH (pHi) (7.52 +/- 0.08 vs. 7.30 +/- 0.06; n = 20, P < 0.05), as determined by 2',7'-bis(carboxyethyl)-5(6)-carboxyfluorescein quantitative microspectrofluorometry. Following intracellular acid loading, ethylisopropylamiloride (EIPA)-inhibitable H+ efflux and 22Na+ influx after 1 min were both significantly enhanced in IGF-I-treated cells compared with controls (8.78 +/- 1.69 vs. 3.03 +/- 0.72 mM/min and 3.47 +/- 0.49 vs. 1.55 +/- 0.35 nmol x mg protein(-1) x min(-1), respectively). 22Na+ uptake studies in PTC grown on permeable supports demonstrated preferential stimulation of apical vs. basolateral NHE. The 50% inhibitory concentrations (IC50) in IGF-I-treated and control cells for EIPA (0.5 and 1.1 microM, respectively) and for HOE-694 (4.0 and 10.0 microM, respectively) were also consistent with predominant activation of apical, rather than basolateral, NHE activity. Kinetic analysis revealed an increase in maximal transport velocity (Vmax, 15.50 +/- 1.50 vs. 7.26 +/- 3.07 mM/min; n = 10, P < 0.05), without a significant change in antiporter affinity for extracellular Na+. Incubation of PTC with 100 ng/ml IGF-I produced an acute, reversible, and EIPA-inhibitable pHi increase of 0.05 +/- 0.01 pH units (n = 5, P < 0.05). The results suggest that IGF-I may contribute to the metachronous stimulation of apical NHE and PTC growth observed in many physiological and pathological conditions involving the human kidney.
Publisher: SAGE Publications
Date: 07-2015
Abstract: The aim of the present study was to investigate the relationship between socio-economic status (SES) and peritoneal dialysis (PD)-related peritonitis. Associations between area SES and peritonitis risk and outcomes were examined in all non-indigenous patients who received PD in Australia between 1 October 2003 and 31 December 2010. SES was assessed by deciles of postcode-based Australian Socio-Economic Indexes for Areas (SEIFA), including the Index of Relative Socio-economic Disadvantage (IRSD), Index of Relative Socio-economic Advantage and Disadvantage (IRSAD), Index of Economic Resources (IER) and Index of Education and Occupation (IEO). 7,417 patients were included in the present study. Mixed-effects Poisson regression demonstrated that incident rate ratios for peritonitis were generally lower in the higher SEIFA-based deciles compared with the reference (decile 1), although the reductions were only statistically significant in some deciles (IRSAD deciles 2 and 4–9; IRSD deciles 4–6; IER deciles 4 and 6; IEO deciles 3 and 6). Mixed-effects logistic regression showed that lower probabilities of hospitalization were predicted by relatively higher SES, and lower probabilities of peritonitis-associated death were predicted by lower SES disadvantage and greater access to economic resources. No association was observed between SES and the risks of peritonitis cure, catheter removal and permanent hemodialysis (HD) transfer. In Australia, where there is universal free healthcare, higher SES was associated with lower risks of peritonitis-associated hospitalization and death, and a lower risk of peritonitis in some categories.
Publisher: Humana Press
Date: 03-10-2009
DOI: 10.1007/978-1-59745-352-3_2
Abstract: Primary cultures of renal proximal tubule cells (PTC) have been widely used to investigate tubule cell function. They provide a model system in which the confounding influences of renal haemodynamics, cell heterogeneity, and neural activity are eliminated. Additionally, they are likely to more closely resemble PTC in vivo than established kidney cell lines, which are often virally immortalised and are of uncertain origin. This chapter describes a method used in our laboratories to isolate and culture pure populations of human PTC. The cortex is dissected away from the medulla and minced finely. Following collagenase digestion, the cells are passed through a sieve and separated on a Percoll density gradient. An almost pure population of tubule fragments forms a band at the base of the gradient. Cultured in a hormonally defined, serum-free growth medium, they form a tightly packed monolayer that retains the differentiated characteristics of PTC for up to three passages.
Publisher: Springer Science and Business Media LLC
Date: 15-02-2011
Abstract: Observational studies have shown that asymptomatic hyperuricemia is associated with increased risks of hypertension, chronic kidney disease (CKD), end-stage renal disease, cardiovascular events, and mortality. Whether these factors represent cause, consequence or incidental associations, however, remains uncertain. Hyperuricemia could be a consequence of impaired kidney function, diuretic therapy or oxidative stress, such that elevated serum urate level represents a marker, rather than a cause, of CKD. On the other hand, small, short-term, single-center studies have shown improvements in blood-pressure control and slowing of CKD progression following serum urate lowering with allopurinol. An adequately powered randomized controlled trial is required to determine whether uric-acid-lowering therapy slows the progression of CKD. This article discusses the rationale for and the feasibility of such a trial. International collaboration is required to plan and conduct a large-scale multicenter trial in order to better inform clinical practice and public health policy about the optimal management of asymptomatic hyperuricemia in patients with CKD.
Publisher: Elsevier BV
Date: 06-2017
Publisher: Wiley
Date: 28-01-2016
DOI: 10.1111/NEP.12580
Abstract: In the last decade, chronic kidney disease (CKD), defined as reduced renal function (glomerular filtration rate (GFR) < 60 mL/min per 1.73 m(2) ) and/or evidence of kidney damage (typically manifested as albuminuria) for at least 3 months, has become one of the fastest-growing public health concerns worldwide. CKD is characterized by reduced clearance and increased serum accumulation of metabolic waste products (uremic retention solutes). At least 152 uremic retention solutes have been reported. This review focuses on indoxyl sulphate (IS), a protein-bound, tryptophan-derived metabolite that is generated by intestinal micro-organisms (microbiota). Animal studies have demonstrated an association between IS accumulation and increased fibrosis and oxidative stress. This has been mirrored by in vitro studies, many of which report cytotoxic effects in kidney proximal tubular cells following IS exposure. Clinical studies have associated IS accumulation with deleterious effects, such as kidney functional decline and adverse cardiovascular events, although causality has not been conclusively established. The aims of this review are to: (i) establish factors associated with increased serum accumulation of IS; (ii) report effects of IS accumulation in clinical studies; (iii) critique the reported effects of IS in the kidney, when administered both in vivo and in vitro; and (iv) summarize both established and hypothetical therapeutic options for reducing serum IS or antagonizing its reported downstream effects in the kidney.
Publisher: Public Library of Science (PLoS)
Date: 03-2017
Publisher: Public Library of Science (PLoS)
Date: 16-12-2014
Publisher: SAGE Publications
Date: 05-2017
Abstract: Preservation of residual renal function (RRF) is associated with improved survival. The aim of the present study was to identify independent predictors of RRF and urine volume (UV) in incident peritoneal dialysis (PD) patients. The study included incident PD patients who were balANZ trial participants. The primary and secondary outcomes were RRF and UV, respectively. Both outcomes were analyzed using mixed effects linear regression with demographic data in the first model and PD-related parameters included in a second model. The study included 161 patients (mean age 57.9 ± 14.1 years, 44% female, 33% diabetic, mean follow-up 19.5 ± 6.6 months). Residual renal function declined from 7.5 ± 2.9 mL/min/1.73 m2 at baseline to 3.3 ± 2.8 mL/min/1.73 m2 at 24 months. Better preservation of RRF was independently predicted by male gender, higher baseline RRF, higher time-varying systolic blood pressure (SBP), biocompatible (neutral pH, low glucose degradation product) PD solution, lower peritoneal ultrafiltration (UF) and lower dialysate glucose exposure. In particular, biocompatible solution resulted in 27% better RRF preservation. Each 1 L/day increase in UF was associated with 8% worse RRF preservation (p = 0.007) and each 10 g/day increase in dialysate glucose exposure was associated with 4% worse RRF preservation (p < 0.001). Residual renal function was not independently predicted by body mass index, diabetes mellitus, renin angiotensin system inhibitors, peritoneal solute transport rate, or PD modality. Similar results were observed for UV. Common modifiable risk factors which were consistently associated with preserved RRF and residual UV were use of biocompatible PD solutions and achievement of higher SBP, lower peritoneal UF, and lower dialysate glucose exposure over time.
Publisher: Wiley
Date: 27-01-2011
DOI: 10.1111/J.1440-1797.2010.01407.X
Abstract: Randomized controlled trials have consistently demonstrated adverse outcomes from targeting higher haemoglobin levels in chronic kidney disease patients treated with erythropoiesis-stimulating agents (ESA). In contrast, observational studies have shown better survival in patients achieving high haemoglobin. Consequently, there is ongoing uncertainty as to whether high haemoglobin or high ESA dose contributes to poor outcomes in ESA-treated chronic kidney disease patients. The objectives of this article are to review the available evidence pertaining to this contentious area, provide recommendations where possible and suggest directions for future research efforts.
Publisher: Elsevier BV
Date: 09-2015
DOI: 10.1053/J.JRN.2015.01.017
Abstract: There is increasing clinical evidence that patients with chronic kidney disease (CKD) have a distinctly dysbiotic intestinal bacterial community, termed the gut microbiota, which in turn drives a cascade of metabolic abnormalities, including uremic toxin production, inflammation, and immunosuppression, that ultimately promotes progressive kidney failure and cardiovascular disease. As the gut microbiota is intimately influenced by diet, the discovery of the kidney-gut axis has created new therapeutic opportunities for nutritional intervention. This review discusses the metabolic pathways linking dysbiotic gut microbiota with adverse health outcomes in patients with CKD, as well as novel therapeutic strategies for targeting these pathways involving dietary protein, fiber, prebiotics, probiotics, and synbiotics. These emerging nutritional interventions may ultimately lead to a paradigm shift in the conventional focus of dietary management in CKD.
Publisher: Wiley
Date: 27-01-2014
Publisher: John Wiley & Sons, Ltd
Date: 15-04-2009
Publisher: Oxford University Press (OUP)
Date: 15-05-2020
DOI: 10.1093/NDT/GFZ076
Abstract: There is widespread recognition that research will be more impactful if it arises from partnerships between patients and researchers, but evidence on best practice for achieving this remains limited. We convened workshops in three Australian cities involving 105 patients/caregivers and 43 clinicians/researchers. In facilitated breakout groups, participants discussed principles and strategies for effective patient involvement in chronic kidney disease research. Transcripts were analysed thematically. Five major themes emerged. ‘Respecting consumer expertise and commitment’ involved valuing unique and diverse experiential knowledge, clarifying expectations and responsibilities, equipping for meaningful involvement and keeping patients ‘in the loop’. ‘Attuning to individual context’ required a preference-based multipronged approach to engagement, reducing the burden of involvement and being sensitive to the patient journey. ‘Harnessing existing relationships and infrastructure’ meant partnering with trusted clinicians, increasing research exposure in clinical settings, mentoring patient to patient and extending reach through established networks. ‘Developing a coordinated approach’ enabled power in the collective and a united voice, a systematic approach for equitable inclusion and streamlined access to opportunities and trustworthy information. ‘Fostering a patient-centred culture’ encompassed building a community, facilitating knowledge exchange and translation, empowering health ownership, providing an opportunity to give back and cultivating trust through transparency. Partnering with patients in research requires respect and recognition of their unique, diverse and complementary experiential expertise. Establishing a supportive, respectful research culture, responding to their individual context, coordinating existing infrastructure and centralizing the flow of information may facilitate patient involvement as active partners in research.
Publisher: Wiley
Date: 04-2006
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 28-02-2020
DOI: 10.2215/CJN.12341019
Abstract: The dietary self-management of CKD is challenging. Telehealth interventions may provide an effective delivery method to facilitate sustained dietary change. This pilot, randomized, controlled trial evaluated secondary and exploratory outcomes after a dietitian-led telehealth coaching intervention to improve diet quality in people with stage 3–4 CKD. The intervention group received phone calls every 2 weeks for 3 months (with concurrent, tailored text messages for 3 months), followed by 3 months of tailored text messages without telephone coaching, to encourage a diet consistent with CKD guidelines. The control group received usual care for 3 months, followed by nontailored, educational text messages for 3 months. Eighty participants (64% male), aged 62±12 years, were randomized to the intervention or control group. Telehealth coaching was safe, with no adverse events or changes to serum biochemistry at any time point. At 3 months, the telehealth intervention, compared with the control, had no detectable effect on overall diet quality on the Alternative Healthy Eating Index (3.2 points; 95% confidence interval, −1.3 to 7.7), nor at 6 months (0.5 points; 95% confidence interval, −4.6 to 5.5). There was no change in clinic BP at any time point in any group. There were significant improvements in several exploratory diet and clinical outcomes, including core food group consumption, vegetable servings, fiber intake, and body weight. Telehealth coaching was safe, but appeared to have no effect on the Alternative Healthy Eating Index or clinic BP. There were clinically significant changes in several exploratory diet and clinical outcomes, which require further investigation. Evaluation of Individualized Telehealth Intensive Coaching to Promote Healthy Eating and Lifestyle in CKD (ENTICE-CKD), ACTRN12616001212448.
Publisher: Elsevier BV
Date: 02-2020
DOI: 10.1016/J.BBRC.2019.11.119
Abstract: There is an increasing interest in studying the crosstalk between tumor-associated adipose tissue and tumor progression. In proximity to the primary site of kidney tumors, perinephric adipose tissue has direct contact with cancer cells when kidney cancer becomes invasive. To mimic the perinephric adipose tissue microenvironment, we applied the liquid overlay-based technique, which cost-effectively generated functional adipocyte spheroids using mesenchymal stem cells isolated from human perinephric adipose tissue. Thereafter, we co-cultured adipocyte spheroids with unpolarized macrophages and discovered an M2 phenotype skew in macrophages. Moreover, we discovered that, in the presence of adipocyte spheroids, M2 macrophages exhibited stronger invasive capacity than M1 macrophages. We further showed that the perinephric adipose tissue sampled from metastatic kidney cancer exhibited high expression of M2 macrophages. In conclusion, the liquid overlay-based technique can generate a novel three-dimensional platform enabling investigation of the interactions of adipocytes and other types of cells in a tumor microenvironment.
Publisher: Wiley
Date: 22-05-2018
DOI: 10.1002/JSO.25037
Abstract: New-onset chronic kidney disease (CKD) following surgical management of kidney tumors is common. This study evaluated risk factors for new-onset CKD after nephrectomy for T1a renal cell carcinoma (RCC) in an Australian population-based cohort. There were 551 RCC patients from the Australian states of Queensland and Victoria included in this study. The primary outcome was new-onset CKD (eGFR < 60 mL/min/1.73 m2). Risk factors for new-onset CKD included tumor size > 20 mm, radical nephrectomy, lower hospital caseloads (<20 cases/year), and rural place of residence. The associations between rural place of residence and low center volume were a consequence of higher radical nephrectomy rates. Risk factors for CKD after nephrectomy generally relate to worse baseline health or the likelihood of undergoing radical nephrectomy. Surgeons in rural centres and hospitals with low caseloads may benefit from formalized integration with specialist centers for continued professional development and case-conferencing, to assist in management decisions.
Publisher: SAGE Publications
Date: 17-01-2020
Abstract: Peritoneal dialysis (PD)-associated peritonitis carries significant morbidity and mortality, and is a leading cause of PD technique failure. This study aimed to assess the scope and variability of PD-associated peritonitis reported in randomized trials and observational studies. The Cochrane Central Register of Controlled Trials, MEDLINE, and Embase were searched from 2007 to June 2018 for randomized trials and observational studies in adult and pediatric patients on PD that reported PD-associated peritonitis as a primary outcome or as part of a composite primary outcome. We assessed the peritonitis definitions used, characteristics of peritonitis, and outcome reporting and analysis. Seventy-seven studies were included; three were randomized trials. Thirty-eight (49%) of the included studies were registry-based observational studies. Twenty-nine percent (n = 22) of the studies did not specify how PD-associated peritonitis was defined. Among those providing a definition of peritonitis, three components were reported: effluent cell count (n = 42, 54%), clinical features consistent with peritonitis (e.g. abdominal pain and/or cloudy dialysis effluent) (n = 35, 45%), and positive effluent culture (n = 19, 25%). Of those components, 1 was required to make the diagnosis in 6 studies (8%), 2 out of 2 were required in 22 studies (29%), 2 out of 3 in 11 studies (14%), and 3 out of 3 in 4 studies (5%). Peritonitis characteristics and outcomes reported across studies included culture-negative peritonitis (n = 47, 61%), refractory peritonitis (n = 42, 55%), repeat peritonitis (n = 9, 12%), relapsing peritonitis (n = 5, 7%), concomitant exit-site infections (n = 16, 21%), and tunnel infections (n = 8, 10%). Peritonitis-related hospitalization was reported in 38% of the studies (n = 29), and peritonitis-related mortality was variably defined and reported in 55% of the studies (n = 42). Peritonitis rate was most frequently reported as episodes per patient-year (n = 40, 52%).
Large variability exists in the definitions, methods of reporting, and analysis of PD-associated peritonitis across trials and observational studies. Standardizing definitions for reporting of peritonitis and associated outcomes will better enable assessment of the comparative effect of interventions on peritonitis. This will facilitate continuous quality improvement measures through reliable benchmarking of this patient-important outcome across centers and countries.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 06-2022
DOI: 10.2215/CJN.16341221
Abstract: Quantifying contemporary peritoneal dialysis time on therapy is important for patients and providers. We describe time on peritoneal dialysis in the context of the outcomes of hemodialysis transfer, death, and kidney transplantation, on the basis of the multinational, observational Peritoneal Dialysis Outcomes and Practice Patterns Study (PDOPPS) from 2014 to 2017. Among 218 randomly selected peritoneal dialysis facilities (7121 patients) in the PDOPPS from Australia/New Zealand, Canada, Japan, Thailand, the United Kingdom, and the United States, we calculated the cumulative incidence from peritoneal dialysis start to hemodialysis transfer, death, or kidney transplantation over 5 years, and adjusted hazard ratios for patient and facility factors associated with death and hemodialysis transfer. Median time on peritoneal dialysis ranged from 1.7 years (interquartile range, 0.8–2.9; the United Kingdom) to 3.2 years (interquartile range, 1.5–6.0; Japan) and was longer where kidney transplantation rates were lower (range: 32% [the United Kingdom] to 2% [Japan and Thailand] over 3 years). Adjusted hemodialysis transfer risk was lowest in Thailand, but death risk was higher in Thailand and the United States compared with most countries. Infection was the leading cause of hemodialysis transfer, with higher hemodialysis transfer risks seen in patients with a history of psychiatric disorder or elevated body mass index. The proportion of patients with total weekly Kt/V ≥1.7 at a facility was not associated with death or hemodialysis transfer. Countries in the PDOPPS with higher rates of kidney transplantation tended to have shorter median times on peritoneal dialysis. Identification of infection as a leading cause of hemodialysis transfer, and of patient and facility factors associated with the risk of transfer, can facilitate interventions to reduce these events.
Publisher: Wiley
Date: 08-1995
Publisher: Wiley
Date: 04-08-2020
Publisher: Informa UK Limited
Date: 2002
DOI: 10.1080/0897719021000006181
Abstract: The interrelationship between myofibroblasts and fibrogenic growth factors in the pathogenesis of renal fibrosis is poorly defined. A temporal and spatial analysis of myofibroblasts, their proliferation and death, and presence of transforming growth factor-beta1 (TGF-beta1) and platelet-derived growth factor-B (PDGF-B) was carried out in an established rodent model in which chronic renal scarring and fibrosis occurs after healed renal papillary necrosis (RPN), similar to that seen with analgesic nephropathy. Treated and control groups (N = 6 and 4, respectively) were compared at 2, 4, 8 and 12 weeks. A positive relationship was found between presence of tubulo-interstitial myofibroblasts and development of fibrosis. Apoptotic myofibroblasts were identified in the interstitium and their incidence peaked 2 weeks after treatment. Levels of interstitial cell apoptosis and fibrosis were negatively correlated over time (r = -0.57, p < 0.01), suggesting that as apoptosis progressively failed to limit myofibroblast numbers, fibrosis increased. In comparison with the diminishing apoptosis in the interstitium, the tubular epithelium had progressively increasing levels of apoptosis over time, indicative of developing atrophy of nephrons. TGF-beta1 protein expression had a close spatial and temporal association with fibrosis and myofibroblasts, whilst PDGF-B appeared to have a closer link with populations of other chronic inflammatory cells such as infiltrating lymphocytes. Peritubular myofibroblasts were often seen near apoptotic cells in the tubular epithelium, suggestive of a paracrine toxic effect of factor/s secreted by the myofibroblasts. In vitro, TGF-beta1 was found to be toxic to renal tubular epithelial cells. 
These findings suggest an interaction between myofibroblasts, their deletion by apoptosis, and the presence of the fibrogenic growth factor TGF-beta1 in renal fibrosis, whereby apoptotic deletion of myofibroblasts could act as a controlling factor in progression of fibrosis.
Publisher: Wiley
Date: 07-2011
Publisher: American College of Physicians
Date: 16-07-2019
DOI: 10.7326/M19-0087
Publisher: SAGE Publications
Date: 06-10-2022
Publisher: SAGE Publications
Date: 11-2011
Abstract: We analyzed data from the Australia and New Zealand Dialysis and Transplant Registry for 1 October 2003 to 31 December 2008 with the aim of describing the nature of peritonitis, therapies, and outcomes in patients on peritoneal dialysis (PD) in Australia. At least 1 episode of PD was observed in 6639 patients. The overall peritonitis rate was 0.60 episodes per patient-year (95% confidence interval: 0.59 to 0.62 episodes), with 6229 peritonitis episodes occurring in 3136 patients. Of those episodes, 13% were culture-negative, and 11% were polymicrobial. Gram-positive organisms were isolated in 53.4% of single-organism peritonitis episodes, and gram-negative organisms in 23.6%. Mycobacterial and fungal peritonitis episodes were rare. Initial antibiotic therapy for most peritonitis episodes used 2 agents (most commonly vancomycin and an aminoglycoside); in 77.2% of episodes, therapy was subsequently changed to a single agent. Tenckhoff catheter removal was required in 20.4% of cases at a median of 6 days; catheter removal was more common in fungal, mycobacterial, and anaerobic infections, with a median time to removal of 4–5 days. Peritonitis was the cause of death in 2.6% of patients. Transfer to hemodialysis and hospitalization were frequent outcomes of peritonitis. There was no relationship between center size and peritonitis rate. The peritonitis rate in Australia between 2003 and 2008 was higher than that reported in many other countries, with a particularly high rate of gram-negative peritonitis.
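The crude rate arithmetic in this abstract can be checked directly: treating the episode count as a Poisson count, the rate and a normal-approximation 95% confidence interval follow from the total exposure. A minimal sketch, with one assumption: the exposure of roughly 10,400 patient-years is inferred from the reported 0.60 episodes per patient-year, not stated in the abstract.

```python
import math

def peritonitis_rate_ci(episodes: int, patient_years: float, z: float = 1.96):
    """Crude peritonitis rate (episodes per patient-year) with a
    normal-approximation confidence interval, treating the episode
    count as a Poisson random variable."""
    rate = episodes / patient_years
    se = math.sqrt(episodes) / patient_years  # SE of the rate
    return rate, rate - z * se, rate + z * se

# Registry figures: 6229 episodes; the ~10,400 patient-years of exposure
# is an assumption back-calculated from the reported rate.
rate, lo, hi = peritonitis_rate_ci(6229, 10382)
```

This reproduces a rate of 0.60 with an interval close to the reported 0.59 to 0.62; the published interval was likely computed with an exact Poisson method, which gives marginally different bounds.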
Publisher: S. Karger AG
Date: 2021
DOI: 10.1159/000515231
Abstract: Introduction: Acute kidney diseases and disorders (AKD) encompass acute kidney injury (AKI) and subacute or persistent alterations in kidney function that occur after an initiating event. Unlike AKI, accurate estimates of the incidence and prognosis of AKD are not available and its clinical significance is uncertain. Methods: We studied the epidemiology and long-term outcome of AKD (as defined by the KDIGO criteria), with or without AKI, in a retrospective cohort of adults hospitalized at a single centre between 2012 and 2016 who had a baseline eGFR ≥60 mL/min/1.73 m2 and were alive at 30 days. In patients for whom follow-up data were available, the risks of major adverse kidney events (MAKEs), CKD, kidney failure, and death were examined by Cox and competing risk regression analyses. Results: Among 62,977 patients, 906 (1%) had AKD with AKI and 485 (1%) had AKD without AKI. Follow-up data were available for 36,118 patients. In this cohort, compared to no kidney disease, AKD with AKI was associated with a higher risk of MAKEs (40.25 per 100 person-years; hazard ratio [HR] 2.51, 95% confidence interval [CI] 2.16–2.91), CKD (27.84 per 100 person-years; subhazard ratio [SHR] 3.18, 95% CI 2.60–3.89), kidney failure (0.56 per 100 person-years; SHR 24.84, 95% CI 5.93–104.03), and death (14.86 per 100 person-years; HR 1.52, 95% CI 1.20–1.92). Patients who had AKD without AKI also had a higher risk of MAKEs (36.21 per 100 person-years; HR 2.26, 95% CI 1.89–2.70), CKD (22.94 per 100 person-years; SHR 2.69, 95% CI 2.11–3.43), kidney failure (0.28 per 100 person-years; SHR 12.63, 95% CI 1.48–107.64), and death (14.86 per 100 person-years; HR 1.57, 95% CI 1.19–2.07). MAKEs after AKD were driven by CKD, especially in the first 3 months. Conclusions: These findings establish the burden and poor prognosis of AKD and support prioritisation of clinical initiatives and research strategies to mitigate such risk.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 04-2020
Abstract: Patients undergoing surgical management of kidney tumors are at increased risk of developing CKD. However, it is often difficult to identify patients at higher risk of clinically significant CKD before surgery, and there is a lack of validated tools to assist clinicians in this process. The authors developed and validated a simple scoring system that accurately and reproducibly stratifies the risk of developing clinically significant CKD after nephrectomy on the basis of readily available parameters. This system provides an evidence-based quantitative tool for clinicians to balance the risk of CKD against other considerations when planning management of kidney tumors, and it will facilitate earlier identification of patients with a higher risk of developing clinically significant CKD, potentially leading to earlier intervention. Clinically significant CKD following surgery for kidney cancer is associated with increased morbidity and mortality, but identifying patients at increased CKD risk remains difficult. Simple methods to stratify the risk of clinically significant CKD after nephrectomy are needed. To develop a tool for stratifying patients' risk of CKD arising after surgery for kidney cancer, we tested models in a population-based cohort of 699 patients with kidney cancer in Queensland, Australia (2012–2013). We validated these models in a population-based cohort of 423 patients from Victoria, Australia, and in patient cohorts from single centers in Queensland, Scotland, and England. Eligible patients had two functioning kidneys and a preoperative eGFR ≥60 ml/min per 1.73 m2. The main outcome was incident eGFR <45 ml/min per 1.73 m2 (stage 3b or higher CKD) at 12 months postnephrectomy. We used prespecified predictors (age ≥65 years, diabetes mellitus, preoperative eGFR, and nephrectomy type [partial/radical]) to fit logistic regression models and grouped patients according to degree of risk of clinically significant CKD (negligible, low, moderate, or high risk).
Absolute risks of stage 3b or higher CKD were about 1%, 3% to 14%, 21% to 26%, and 46% to 69% across the four strata of negligible, low, moderate, and high risk, respectively. The negative predictive value of the negligible-risk category was 98.9% for clinically significant CKD. The c-statistic for this score ranged from 0.84 to 0.88 across derivation and validation cohorts. Our simple scoring system can reproducibly stratify postnephrectomy CKD risk on the basis of readily available parameters. This clinical tool's quantitative assessment of CKD risk may be weighed against other considerations when planning management of kidney tumors and help inform shared decision making between clinicians and patients.
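To illustrate the shape of such a tool: a logistic model over the four prespecified predictors can be converted into strata as sketched below. The coefficients and stratum cut-points here are hypothetical placeholders for illustration only; the abstract does not report the fitted values of the published model.

```python
import math

# Hypothetical coefficients -- NOT the published model's fitted values.
COEFS = {"intercept": -4.0, "age_ge_65": 0.8, "diabetes": 0.7,
         "radical_nephrectomy": 2.0, "egfr_per_unit_below_90": 0.05}

def ckd_risk(age_ge_65: bool, diabetes: bool, radical: bool, preop_egfr: float) -> float:
    """Predicted probability of stage 3b+ CKD at 12 months post-nephrectomy
    from a logistic model over the four prespecified predictors."""
    lp = (COEFS["intercept"]
          + COEFS["age_ge_65"] * age_ge_65
          + COEFS["diabetes"] * diabetes
          + COEFS["radical_nephrectomy"] * radical
          + COEFS["egfr_per_unit_below_90"] * max(0.0, 90.0 - preop_egfr))
    return 1.0 / (1.0 + math.exp(-lp))

def stratum(p: float) -> str:
    """Group a predicted probability into the four published risk strata
    (cut-points here are illustrative, not the validated ones)."""
    if p < 0.01:
        return "negligible"
    if p < 0.15:
        return "low"
    if p < 0.30:
        return "moderate"
    return "high"
```

Any real use of such a score would require the published coefficients and validated cut-points; the value of the stratified form is that clinicians can map a patient to a risk band without computing probabilities by hand.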
Publisher: Wiley
Date: 2007
DOI: 10.1002/RCM.2806
Abstract: Flow injection electrospray ionization tandem mass spectrometric methods for succinylacetone (SA) in 250 microL urine, using d5-SA as internal standard, and in 3 mm dried bloodspots, using 13C4-SA as internal standard, are described. Selectivity and sensitivity of analysis are achieved by the use of a mono-Girard T derivative. The measured SA normal range in infant urine (n=20) is 0.013-0.27 micromol/mmol creatinine. The measured SA normal range in newborn bloodspots (n=152) is 0-0.30 micromol/L. Bloodspots from children with hepatorenal tyrosinemia type 1, kept at room temperature for up to 7 years, afforded SA concentrations of 0.9-5.7 micromol/L.
Publisher: Oxford University Press (OUP)
Date: 10-1995
Publisher: Elsevier BV
Date: 09-2019
Publisher: Wiley
Date: 23-12-2003
Publisher: Wiley
Date: 02-05-2019
DOI: 10.1111/NEP.13566
Abstract: Patients with autosomal dominant polycystic kidney disease (ADPKD) are at increased risk of premature mortality, morbidities and complications, which severely impair quality of life. However, patient-centered outcomes are not consistently reported in trials in ADPKD, which can limit shared decision-making. We aimed to identify outcomes important to patients and caregivers and the reasons for their priorities. The nominal group technique was adopted, involving patients with ADPKD and caregivers who were purposively selected from eight centres across Australia, France and the Republic of Korea. Participants identified, ranked and discussed outcomes for trials in ADPKD. We calculated an importance score (0-1) for each outcome and conducted thematic analyses. Across 17 groups, 154 participants (121 patients, 33 caregivers) aged 19 to 78 years (mean 54.5 years) identified 55 outcomes. The 10 highest-ranked outcomes were: kidney function (importance score 0.36), end-stage kidney disease (0.32), survival (0.21), cyst size/growth (0.20), cyst pain/bleeding (0.18), blood pressure (0.17), ability to work (0.16), cerebral aneurysm/stroke (0.14), mobility/physical function (0.12), and fatigue (0.12). Three themes were identified: threatening semblance of normality, inability to control, and making sense of diverse risks. For patients with ADPKD and their caregivers, kidney function, delayed progression to end-stage kidney disease and survival were the highest priorities, and were focused on achieving normality and maintaining control over health and lifestyle. Implementing these patient-important outcomes may improve the meaning and relevance of trials to inform clinical care in ADPKD.
Publisher: Oxford University Press (OUP)
Date: 29-09-2014
DOI: 10.1093/NDT/GFT401
Abstract: Oral disease may be increased in people with chronic kidney disease (CKD) and, due to associations with inflammation and malnutrition, represents a potentially modifiable risk factor for cardiovascular disease and mortality. We summarized the prevalence of oral disease in adults with CKD and explored any association between oral disease and mortality. We conducted a systematic review of observational studies evaluating oral health in adults with CKD identified in MEDLINE (through September 2012), without language restriction. We summarized prevalence and associations with all-cause and cardiovascular mortality using random-effects meta-analysis, and explored sources of heterogeneity between studies using meta-regression. Eighty-eight studies in 125 populations comprising 11 340 adults were eligible. Edentulism affected one in five adults with CKD Stage 5D (dialysis) {20.6% [95% confidence interval (CI), 16.4-25.6]}. Periodontitis was more common in CKD Stage 5D [56.8% (CI, 39.3-72.8)] than in less severe CKD [31.6% (CI, 19.0-47.6)], although data linking periodontitis with premature death were scant. One-quarter of patients with CKD Stage 5D reported never brushing their teeth [25.6% (CI, 10.2-51.1)] and a minority used dental floss [11.4% (CI, 6.2-19.8)]; oral pain was reported by one-sixth [18.7% (CI, 8.8-35.4)], while half of patients experienced a dry mouth [48.4% (CI, 37.5-59.5)]. Data for kidney transplant recipients and CKD Stages 1-5 were limited. Oral disease is common in adults with CKD, potentially reflects low use of preventative dental services, and may be an important determinant of health in this clinical setting.
Publisher: Oxford University Press (OUP)
Date: 08-2012
DOI: 10.1093/NDT/GFS314
Publisher: BMJ
Date: 12-2015
Publisher: Oxford University Press (OUP)
Date: 06-10-2012
DOI: 10.1093/NDT/GFR582
Abstract: Evidence on the role of seasonal variation in peritoneal dialysis (PD)-related peritonitis has been limited to a few small single-centre studies. Using all 6610 Australian patients receiving PD between 1 October 2003 and 31 December 2008, we evaluated the influence of seasons on peritonitis rates (Poisson regression) and outcomes (multivariable logistic regression). The overall rate of peritonitis was 0.59 episodes per patient-year of treatment. Using winter as the reference season, the peritonitis incidence rate ratios (95% confidence interval) for summer, autumn and spring were 1.02 (0.95-1.09), 1.01 (0.94-1.08) and 0.99 (0.92-1.06), respectively. Significant seasonal variations were observed in the rates of peritonitis caused by coagulase-negative staphylococci (spring and summer peaks), corynebacteria (winter peak) and Gram-negative organisms (summer and autumn peaks). There were trends to seasonal variations in fungal peritonitis (summer and autumn peaks) and Pseudomonas peritonitis (summer peak). No significant seasonal variations were observed for other organisms. Peritonitis outcomes did not vary significantly according to season. Seasonal variation has no appreciable influence on overall PD peritonitis rates or clinical outcomes. Nevertheless, significant seasonal variations were observed in the rates of peritonitis due to specific microorganisms, which may allow institutions to target infection control strategies more precisely ahead of higher-risk seasons.
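The incidence rate ratios quoted above follow from per-season episode counts and exposure, with a normal-approximation confidence interval on the log scale. A minimal sketch, with made-up seasonal counts: the abstract reports only the resulting ratios, not the underlying counts, so the numbers below are purely illustrative.

```python
import math

def irr_ci(events_a: int, py_a: float, events_b: int, py_b: float, z: float = 1.96):
    """Incidence rate ratio of season A versus reference season B,
    with a log-scale normal-approximation 95% confidence interval."""
    irr = (events_a / py_a) / (events_b / py_b)
    se_log = math.sqrt(1.0 / events_a + 1.0 / events_b)  # SE of log(IRR)
    return irr, irr * math.exp(-z * se_log), irr * math.exp(z * se_log)

# Illustrative only: equal exposure per season, slightly more summer episodes.
irr, lo, hi = irr_ci(1000, 1700.0, 980, 1700.0)
```

With counts of this magnitude the interval straddles 1.0, mirroring the null overall seasonal effect reported; the published analysis additionally adjusted via Poisson regression, which this crude ratio does not.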
Publisher: Springer Science and Business Media LLC
Date: 31-08-2021
Publisher: Elsevier BV
Date: 2010
DOI: 10.1053/J.AJKD.2009.08.020
Abstract: The study aim was to examine the frequency, predictors, treatment, and clinical outcomes of peritoneal dialysis-associated polymicrobial peritonitis. This was an observational cohort study using ANZDATA (The Australia and New Zealand Dialysis and Transplant Registry) data, including all Australian peritoneal dialysis patients between October 2003 and December 2006. Predictors examined were age, sex, race, body mass index, baseline renal function, late referral, kidney disease, smoking status, comorbidity, peritoneal permeability, center, state, organisms, and antibiotic regimen. Outcomes were polymicrobial peritonitis occurrence, relapse, hospitalization, catheter removal, hemodialysis transfer, and death. A total of 359 episodes of polymicrobial peritonitis occurred in 324 individuals, representing 10% of all peritonitis episodes during 6,002 patient-years. The organisms isolated included mixed Gram-positive and Gram-negative organisms (41%), pure Gram-negative organisms (22%), pure Gram-positive organisms (25%), and mixed bacteria and fungi (13%). There were no significant independent predictors of polymicrobial peritonitis except for the presence of chronic lung disease. Compared with single-organism infections, polymicrobial peritonitis was associated with higher rates of hospitalization (83% vs 68%; P < 0.001), catheter removal (43% vs 19%; P < 0.001), and permanent hemodialysis transfer (38% vs 15%). Patients whose catheters were removed more than 1 week after polymicrobial peritonitis onset were significantly more likely to be permanently transferred to hemodialysis therapy than those who had earlier catheter removal (92% vs 81%; P = 0.05). Limitations include limited covariate adjustment; residual confounding and coding bias could not be excluded. Polymicrobial peritonitis can be treated successfully using antibiotics alone without catheter removal in most cases, particularly when only Gram-positive organisms are isolated.
Isolation of Gram-negative bacteria (with or without Gram-positive bacteria) or fungi carries a worse prognosis and generally should be treated with early catheter removal and appropriate antimicrobial therapy.
Publisher: CMA Joule Inc.
Date: 06-02-2012
DOI: 10.1503/CMAJ.111355
Publisher: SAGE Publications
Date: 20-09-2022
DOI: 10.1177/08968608221126849
Abstract: Gastrointestinal (GI) health is considered vital to the success of peritoneal dialysis (PD) and is critically important to patients, caregivers and clinicians. However, the multiplicity of GI outcome measures in trials undermines the ability to evaluate the frequency, impact and treatment of GI symptoms in patients receiving PD. Therefore, this study aimed to assess the range and consistency of GI outcomes reported in contemporary PD trials. We conducted a systematic review in individuals with kidney failure requiring PD, covering all randomised controlled trials of any PD-related intervention identified from the PubMed, Embase and Cochrane Central Register of Controlled Trials (CENTRAL) databases from January 2010 to July 2022. The frequency and characteristics of GI outcome measures were analysed and classified. Of the 324 eligible PD trials, GI outcomes were reported in only 61 (19%) trials, mostly as patient-reported outcomes (45 trials; 74%). The most frequently reported outcomes were nausea in 27 (43%), diarrhoea in 26 (43%), vomiting in 22 (36%), constipation in 21 (34%) and abdominal pain in 19 (31%) of trials. PD peritonitis was the primary non-GI outcome reported in 24 (40%) trials, followed by death in 13 (21%) trials and exit-site infection in 9 (15%) trials. Across all trials, 172 GI outcome measures were extracted and grouped into 29 different outcomes. Nausea and diarrhoea contributed 16% and 15% of GI outcomes, respectively, while vomiting, constipation and abdominal pain contributed 13%, 12% and 12%, respectively. Most (90%) GI outcomes were patient-reported adverse effects with no defined metrics. Faecal microbiome was reported as the primary study outcome in 3 trials, using the subjective global assessment score, the GI symptom rating scale, and faecal microbiological and biochemical analysis. Two trials reported nausea as a primary study outcome, using the symptom assessment score (SAS) and the kidney disease quality of life short form-36.
One trial each reported anorexia and abdominal pain as the primary study outcome using the SAS. Bowel habits, constipation and stool type were also each reported as the primary study outcome in one trial, using the Bristol stool form scale. GI bleeding was reported as a secondary outcome in three (37%) of the eight trials reporting it. A limitation is the restricted sampling frame, which focused on contemporary trials. Despite the clinical importance of GI outcomes among patients on PD, they are reported in only 19% of PD trials, using inconsistent metrics, often as patient-reported adverse events. Efforts to standardise GI outcome reporting are critical to optimising the comparability, reliability and value of trial evidence to improve outcomes for patients receiving PD.
Publisher: Wiley
Date: 06-02-2007
DOI: 10.1111/J.1440-1797.2006.00702.X
Abstract: The longevity of peritoneal dialysis (PD) is limited by technique failure and patient mortality. The authors assessed the influence of baseline and time-averaged fluid removal on patient, technique and death-censored technique survival. Peritoneal and total fluid removal was measured 1 month after commencing PD, then 6-monthly, in 225 incident patients (mean age 55.3 ± 15.8 years; 52% male). A Cox proportional hazards regression analysis was performed to identify variables independently predictive of technique and patient survival. Seventy (31.9%) patients were transferred to haemodialysis and 39 (17.63%) died. Technique survival was greatest in the middle tertile of baseline total fluid removal (mean survival time 3.5 vs 2.5 and 2.2 years for the lower and upper tertiles, respectively; log rank 6.5, P=0.039). The middle tertile of both baseline and time-averaged total fluid removal was a significant predictor of PD technique survival relative to the upper tertile (adjusted hazard ratio (HR) 0.476, 95% CI 0.286-0.795, P=0.005 for baseline; HR 0.573, 95% CI 0.350-0.939, P=0.027 for time-averaged). Other significant variables on multivariate analysis were body mass index (HR 1.044 per kg/m2, 95% CI 1.005-1.084, P=0.028), creatinine (HR 0.999 per micromol, 95% CI 0.998-1.000, P=0.048) and residual Kt/V (HR 0.418, 95% CI 0.233-0.747, P=0.003). Patient survival was not affected by fluid removal. Patients with moderate total fluid removal, both at baseline and throughout their PD career, have improved technique survival. Attention should be paid to optimizing total fluid removal.
No related grants have been discovered for David Johnson.