ORCID Profile
0000-0002-3688-9353
Current Organisation
Royal Brisbane and Women's Hospital
Publisher: SAGE Publications
Date: 16-12-2021
Abstract: HIV-1 transmitted drug resistance (TDR) is associated with transmission in men who have sex with men (MSM) and in non-MSM clusters, is linked to sexually transmitted infections (STIs), and can lead to antiretroviral failure. UK guidelines recommend TDR testing in all newly diagnosed people living with HIV. We audited performance of TDR testing in our large tertiary HIV department from 2014–2020. All new patients had TDR testing attempted in the study period. The rate of TDR was 8% and was associated with increasing age and having a non-B subtype. Having a non-B subtype was not associated with being non-UK born. Thirty-four percent of people were diagnosed with a bacterial STI at the time of HIV diagnosis, but STI diagnosis was not associated with TDR. There was no significant change in TDR over the 6-year audit period. TDR remains a small but significant problem. Identifying these populations and providing effective HIV prevention interventions will reduce HIV incidence and TDR.
Publisher: Mark Allen Group
Date: 12-2006
DOI: 10.12968/HMED.2006.67.12.22443
Abstract: A 21-year-old woman was admitted to the authors' hospital with symptoms of breathlessness, pleuritic chest pain and haemoptysis, having been unwell for 1 week with a flu-like illness. Clinical examination revealed a rapid respiratory rate (36/min), tachycardia (134 beats per minute), pyrexia (39°C), bibasal coarse crepitations and bronchial breathing. Investigations showed hypoxia (PO2 12.4 on 100% oxygen), raised inflammatory markers (C-reactive protein 324 mg/litre), thrombocytopenia (platelets 144 × 10⁹/litre), leucopenia (2 × 10⁹/litre) and neutropenia (0.88 × 10⁹/litre). Chest X-ray revealed bilateral air space shadowing (Figure 1). The patient was admitted to the high dependency unit and started on intravenous co-amoxiclav (1.2 g three times per day) and clarithromycin (500 mg twice daily). Blood cultures at 24 hours grew Staphylococcus aureus resistant to penicillin and fusidic acid. The patient was switched to intravenous flucloxacillin. Admission was complicated by the development of a small pleural effusion. The patient was managed with high flow oxygen therapy, did not require ventilatory support at any stage and was discharged home after 17 days. The patient received flucloxacillin for a total of 4 weeks. The S. aureus isolate carried enterotoxins G and I and the Panton-Valentine leucocidin gene. Throat swab was positive for influenza B by a molecular method.
Publisher: BMJ
Date: 08-02-2023
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 09-2002
Publisher: Elsevier BV
Date: 12-2020
Publisher: Springer Science and Business Media LLC
Date: 15-04-2021
DOI: 10.1186/S13561-021-00312-4
Abstract: Economic evaluations using decision-analytic models such as Markov models (MM) and discrete-event simulations (DES) are valuable aids to resource allocation. The choice of modelling method is critical because an inappropriate model yields results that could lead to flawed decision making. The aim of this study was to compare cost-effectiveness when MM and DES were used to model the results of transplanting a lower-quality kidney versus remaining waitlisted for a kidney. Cost-effectiveness was assessed using MM and DES. We used parametric survival models to estimate the time-dependent transition probabilities of the MM and the distribution of time-to-event in the DES. MMs were simulated in 12- and 6-monthly cycles, over five- and 20-year time horizons. DES model output had a close fit to the actual data. Irrespective of the modelling method, the cycle length of the MM or the time horizon, transplanting a low-quality kidney as compared to remaining waitlisted was the dominant strategy. However, there were discrepancies in costs, effectiveness and net monetary benefit (NMB) among the different modelling methods. The incremental NMB of the MM with 6-month cycle lengths was a closer fit to the incremental NMB of the DES. The gap in the fit of the two cycle lengths to DES output reduced as the time horizon increased. Different modelling methods were unlikely to influence the decision to accept a lower-quality kidney transplant or remain waitlisted on dialysis. Both models produced similar results when time-dependent transition probabilities were used, most notably with shorter cycle lengths and longer time horizons.
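The cycle-length effect described in this abstract can be sketched with a toy cohort model. The hazard value below is an assumption for illustration, not a parameter from the study: a Markov model that credits every cycle-start survivor with a full cycle overestimates life-years relative to the continuous-time (DES-like) answer, and a 6-month cycle sits closer than a 12-month cycle.

```python
import math

# Assumed constant annual hazard of graft failure -- an illustrative value,
# not a figure from the study.
H = 0.08
HORIZON = 20.0  # years

def markov_life_years(hazard, cycle, horizon):
    """Expected years with a functioning graft from a Markov cohort model.
    Members alive at the start of a cycle are credited with the whole cycle,
    which is where the cycle-length discretisation error comes from."""
    p_fail = 1.0 - math.exp(-hazard * cycle)  # per-cycle transition probability
    alive, total = 1.0, 0.0
    for _ in range(int(round(horizon / cycle))):
        total += alive * cycle
        alive *= 1.0 - p_fail
    return total

def continuous_life_years(hazard, horizon):
    """DES-style continuous-time reference: the integral of S(t) = exp(-h t)."""
    return (1.0 - math.exp(-hazard * horizon)) / hazard

annual = markov_life_years(H, 1.0, HORIZON)  # 12-month cycles
semi = markov_life_years(H, 0.5, HORIZON)    # 6-month cycles
exact = continuous_life_years(H, HORIZON)
```

Under these assumptions the 6-month cycle lands closer to the continuous-time value than the 12-month cycle, mirroring the abstract's finding that shorter cycles fit the DES output better.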
Publisher: Dustri-Verlag Dr. Karl Feistle
Date: 02-2008
DOI: 10.5414/CNP69067
Abstract: A group of UK consultant transplant physicians and surgeons (the Consensus Group) met to consider the implications and interpretation of the National Institute for Clinical Excellence's (NICE) Technology Appraisal No. 85 on the use of immunosuppressive therapy for renal transplantation in adults. This group considered what the implications of these guidelines might be for clinical practice, and consensus was developed on those areas that were potentially open to different interpretations. A wider survey of nephrologists and transplant surgeons throughout the UK was also performed to gauge the impact of the NICE recommendations. The outcomes of the Consensus Group's discussions are presented, with particular reference to the recommendations on how to respond to calcineurin inhibitor (CNI) intolerance. The survey suggested that the publication of this NICE guidance has resulted in relatively few changes in prescribing practice: UK transplant centers continue to use a wide range of locally developed protocols for immunosuppressive therapy. These include the use of agents such as mycophenolate mofetil (MMF) and sirolimus, despite the fact that both drugs appeared to receive only conditional acceptance in the NICE guidelines.
Publisher: Frontiers Media SA
Date: 06-1998
Publisher: Elsevier BV
Date: 06-1998
DOI: 10.1053/AJKD.1998.V31.PM9631841
Abstract: Incidence of end-stage renal disease in medically treated patients with severe bilateral atherosclerotic renovascular disease. Atherosclerotic renovascular disease is an important cause of end-stage renal disease (ESRD). The exact incidence of ESRD and the rate of decline in glomerular filtration rate (GFR) in patients with this condition are unknown. We report the mortality, the rate of decline in renal function, and the incidence of ESRD in 51 patients with bilateral atherosclerotic renovascular disease followed up for a median period of 52 months. None of these patients had undergone any surgical or radiological intervention. Renal function was determined by serial measurements of serum creatinine. Bilateral atherosclerotic renovascular disease was associated with a high mortality rate: the crude mortality rate at 60 months was 45%. Assessment of renal function showed impaired renal function at the time of angiography and a nonuniform and variable decline in renal function during the period of observation. The median GFR decreased from 39 mL/min (range, 15 to 80 mL/min) at the time of angiography to 31 mL/min (range, 10 to 70 mL/min) and 24 mL/min (range, 10 to 40 mL/min) at 24 and 60 months, respectively (P < 0.05). The calculated mean rate of decline in GFR for all patients was 4 mL/min/yr (range, 1 to 16 mL/min/yr). Over the 5 years, there was a progressive increase in the incidence of ESRD. Of the original 51 patients who underwent angiography, six reached ESRD. The crude incidence of ESRD was, therefore, 12%. Patients who reached ESRD were characterized by advanced azotemia at the time of angiography (median GFR, 25 mL/min) and a rapid decline in GFR (8 mL/min/yr) compared with patients who did not reach ESRD during the observation period (median GFR, 43 mL/min and an average rate of decline in GFR of 3 mL/min/yr).
Publisher: Springer Science and Business Media LLC
Date: 09-10-2020
DOI: 10.1186/S12913-020-05736-Y
Abstract: Matching the survival of a donor kidney with that of the recipient (longevity matching) is used in some kidney allocation systems to maximize graft-life years. It is not part of the allocation algorithm for Australia. Given the growing evidence of survival benefit from longevity-matching-based allocation algorithms, development of a similar kidney allocation system for Australia is currently underway. The aim of this research is to estimate the impact of ‘longevity matching’ on costs and health outcomes in the Australian healthcare system. A decision-analytic model to estimate cost-effectiveness was developed using a Markov process. Four plausible competing allocation options were compared to the current kidney allocation practice. Models were simulated in one-year cycles over a 20-year time horizon, with transitions through distinct health states relevant to the kidney recipient. Willingness to pay was set at AUD 28,000. Base case analysis indicated that allocating the worst 20% of Kidney Donor Risk Index (KDRI) donor kidneys to the worst 20% of estimated post-transplant survival (EPTS) recipients (option 2) and allocating the oldest 25% of donor kidneys to the oldest 25% of recipients are both cost saving and more effective compared to the current Australian allocation practice. Option 2 returned the lowest costs, greatest health benefits and largest gain in net monetary benefit (NMB). Allocating the best 20% of KDRI donor kidneys to the best 20% of EPTS recipients had the lowest expected incremental NMB. Of the four longevity-based kidney allocation practices considered, transplanting the lowest-quality kidneys to the worst-off kidney recipients (option 2) was estimated to return the best value for money for the Australian health system.
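The net-monetary-benefit comparison this abstract reports can be illustrated with a minimal sketch. The AUD 28,000 willingness-to-pay threshold is from the abstract, but the (effect, cost) pairs below are invented for illustration and are not the paper's results.

```python
WTP = 28_000  # AUD per unit of health benefit, the threshold from the abstract

def nmb(effect, cost, wtp=WTP):
    """Net monetary benefit: health effect valued at the willingness-to-pay
    threshold, minus cost. Higher is better."""
    return effect * wtp - cost

# Hypothetical 20-year (effect, cost) pairs per allocation option;
# invented numbers, not taken from the paper.
options = {
    "current": (8.0, 400_000),
    "option_2": (8.3, 380_000),
}

base = nmb(*options["current"])
incremental_nmb = {name: nmb(*pair) - base
                   for name, pair in options.items() if name != "current"}
```

A positive incremental NMB (here "option_2" gains both effect and cost savings) marks the option as preferred at the chosen threshold.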
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 09-2018
Publisher: Springer Science and Business Media LLC
Date: 19-05-2020
DOI: 10.1186/S12962-020-00213-Z
Abstract: Health systems are under pressure to deliver more effective care without expansion of resources. This is particularly pertinent to diseases like chronic kidney disease (CKD) that exact a substantial financial burden on many health systems. The aim of this study is to systematically review the cost utility analysis (CUA) evidence generated across interventions for CKD patients undergoing kidney transplant (KT). A systematic review of CUAs on interventions for CKD patients undergoing KT was carried out using a search of MEDLINE, CINAHL, EMBASE, PsycINFO and NHS-EED. The CHEERS checklist was used as a set of good-practice criteria in determining the reporting quality of the economic evaluations. Quality of the data used to inform model parameters was determined using modified hierarchies of data sources. A total of 330 articles were identified, of which 16 met the inclusion criteria. Almost all (n = 15) of the studies were from high-income countries. Of the 24 characteristics assessed in the CHEERS checklist, more than 80% of the selected studies reported 14. Reporting of the CUAs was characterized by a lack of transparency about model assumptions, a narrow economic perspective and incomplete assessment of the effect of uncertainty in model parameters on the results. The data used for the economic models were of satisfactory quality. The authors of 13 studies reported the intervention as cost saving and improving quality of life, whereas three studies reported interventions that increased costs while improving quality of life. In addition to the baseline analysis, sensitivity analysis was performed in all the evaluations except one. Transplanting certain high-risk donor kidneys (kidneys at high risk of HIV and hepatitis C infection, HLA-mismatched kidneys, and kidneys with a high Kidney Donor Profile Index) and a payment to living donors were found to be cost-effective. The quality of the economic evaluations reviewed in this paper was assessed to be satisfactory.
Implementation of these strategies will significantly impact current systems of KT and require a systematic implementation plan and coordinated efforts from relevant stakeholders.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 03-2010
Publisher: Wiley
Date: 02-02-2022
DOI: 10.1111/JDV.17961
Publisher: Frontiers Media SA
Date: 09-1996
Publisher: Wiley
Date: 22-09-2022
DOI: 10.1111/HIV.13406
Abstract: We aimed to evaluate the usability and acceptability of a co-designed mobile health (mHealth) application (PrEP-EmERGE) within a digital health pathway to support HIV pre-exposure prophylaxis (PrEP). This was a cross-sectional study to evaluate the usability and acceptability of the PrEP-EmERGE app. Data were collected via an online survey sent to all PrEP-EmERGE users in September 2021. Usability was assessed with a validated usability tool, the System Usability Scale (SUS). Acceptability was assessed using modified patient-reported experience measures (PREMs). Quantitative data were analysed using descriptive and/or inferential statistics and qualitative data (free-text responses) using thematic analysis. In total, 81/133 (61%) active PrEP-EmERGE users completed the online survey, which was available directly from their PrEP-EmERGE app: 78/81 (96%) identified as cis-male, 74/81 (91%) reported their ethnicity as 'white', 69/81 (85%) reported daily PrEP use, 7/81 (9%) reported using an event-based dosing schedule, and 5/81 (6%) were switching between dosing schedules. Overall, the median SUS score was 78/100 (interquartile range: 70-92). There were no differences in median SUS scores by PrEP dosing schedule (p = 0.78) or months of experience of using the app (p = 0.31). Overall, 73/81 (90%) would recommend the PrEP-EmERGE app to a friend and 78/81 (96%) rated their satisfaction with the app as excellent, good or satisfactory. The free-text responses generated three key themes: accessibility (for results and information), autonomy (taking responsibility for their sexual health) and functionality (including technical recommendations for app development and the digital health pathway). Innovative, co-designed digital health pathways such as PrEP-EmERGE can help sexual health services to manage increasing numbers of people accessing PrEP, ensuring that they retain access for those who need to be seen face-to-face.
We report high levels of acceptability and usability during the first 4 months of this novel pathway.
Publisher: Elsevier BV
Date: 05-2009
Publisher: Oxford University Press (OUP)
Date: 10-2000
Publisher: Frontiers Media SA
Date: 09-1997
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 27-12-2006
Publisher: SAGE Publications
Date: 03-2011
Abstract: The burgeoning population of patients requiring renal replacement therapy places a disproportionate strain on National Health Service resources. Although renal transplantation is the preferred treatment modality for patients with established renal failure, achieving both clinical and financial advantages, limitations to organ donation and clinical comorbidities will leave a significant proportion of patients with established renal failure requiring expensive dialysis therapy in the form of either hemodialysis or peritoneal dialysis. An understanding of dialysis economics is essential for both healthcare providers and clinical leaders to establish clinically efficient and cost-effective treatment modalities that maximize service provision. In light of changes to the provision of healthcare funds in the form of “Payment by Results,” it is imperative for UK renal units to adopt clinically effective and financially accountable dialysis programs. This article explores the role of dialysis economics and its implications for UK renal replacement therapy programs.
Publisher: Informa Healthcare
Date: 06-10-2005
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 15-02-2010
Publisher: Springer Science and Business Media LLC
Date: 15-11-2012
Abstract: New-onset diabetes mellitus after kidney transplantation (NODAT) is widely acknowledged to be associated with increased morbidity and mortality, as well as poor quality of life. Clear evidence links the occurrence of NODAT to accelerated progression of some macrovascular and/or microvascular complications. However, the evidence that some complications commonly attributed to diabetes mellitus occur in the context of transplantation lacks robustness. Certain complications are transplantation-specific and prevalent, but others are not frequently observed or documented. For this reason, it is essential that clinicians are aware of the array of potential complications associated with NODAT in kidney allograft recipients. Rather than simply translating evidence from the general population to the high-risk transplant recipient, this Review aims to provide specific guidance on diabetes-related complications in the context of a complex transplantation environment.
Publisher: Oxford University Press (OUP)
Date: 07-03-2008
DOI: 10.1093/NDT/GFM870
Abstract: The UK National Health Service (NHS) will fund renal services using Payment by Results (PbR) from 2009. Central to the success of PbR will be the creation of tariffs that reflect the true cost of medical services. We have therefore estimated the cost of different dialysis modalities in the Cardiff and Vale NHS Trust and six other hospitals in the UK. We used semi-structured interviews with nephrologists, head nurses and business managers to identify the steps involved in delivering the different dialysis modalities. We assigned costs to these using published figures or suppliers' published price lists. The study used mixed costing methods. Dialysis costs were estimated by a combination of microcosting and a top-down approach. Where we did not have access to detailed accounts, we applied values for Cardiff. The most efficient modalities were automated peritoneal dialysis (APD) and continuous ambulatory peritoneal dialysis (CAPD), the mean annual costs of which were £21,655 and £15,570, respectively. Hospital-based haemodialysis (HD) cost £35,023 per annum and satellite-unit-based HD cost £32,669. The cost of home-based HD was £20,764 per year (based on data from only one unit). The main cost drivers for PD were the costs of solutions and management of anaemia. For HD they were the costs of disposables, nursing, the overheads associated with running the unit and management of anaemia. Renal tariffs for PbR need to reflect the true cost of dialysis provision if choices about modalities are not to be influenced by erroneous estimates of cost. Knowledge of the true costs of modalities will also maximize the number of established renal failure patients treated by dialysis within the limited funds available from the NHS.
Publisher: F1000 Research Ltd
Date: 09-03-2020
DOI: 10.12688/F1000RESEARCH.20661.2
Abstract: Background: A mechanism to predict graft failure before the actual kidney transplantation occurs is crucial to clinical management of chronic kidney disease patients. Several kidney graft outcome prediction models, developed using machine learning methods, are available in the literature. However, most of those models used small datasets and none of the machine learning-based prediction models available in the medical literature modelled time-to-event (survival) information, but instead used the binary outcome of failure or not. The objective of this study is to develop two separate machine learning-based predictive models to predict graft failure following live and deceased donor kidney transplant, using time-to-event data in a large national dataset from Australia. Methods: The dataset provided by the Australia and New Zealand Dialysis and Transplant Registry will be used for the analysis. This retrospective dataset contains the cohort of patients who underwent a kidney transplant in Australia from January 1st, 2007, to December 31st, 2017. This included 3,758 live donor transplants and 7,365 deceased donor transplants. Three machine learning methods (survival tree, random survival forest and survival support vector machine) and one traditional regression method, Cox proportional regression, will be used to develop the two predictive models (for live donor and deceased donor transplants). The best predictive model will be selected based on the model’s performance. Discussion: This protocol describes the development of two separate machine learning-based predictive models to predict graft failure following live and deceased donor kidney transplant, using a large national dataset from Australia. Furthermore, these two models will be the most comprehensive kidney graft failure predictive models that have used survival data to model using machine learning techniques. 
Thus, these models are expected to provide valuable insight into the complex interactions between graft failure and donor and recipient characteristics.
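The protocol's emphasis on modelling time-to-event information, rather than a binary failed/not-failed outcome, can be illustrated with the simplest survival estimator. This stdlib-only Kaplan-Meier sketch uses invented toy data, not registry data, and stands in for the survival-tree, random-survival-forest and survival-SVM methods the protocol names.

```python
# Toy time-to-event data: (years_to_event_or_censoring, event_observed).
# event_observed = 1 means graft failure was seen; 0 means the record was
# censored (graft still functioning at last follow-up). Invented values.
data = [(1.0, 1), (2.0, 0), (3.0, 1), (4.0, 1), (5.0, 0), (6.0, 1)]

def kaplan_meier(records):
    """Kaplan-Meier survival curve: at each observed failure time t,
    S is multiplied by (1 - d/n), where d is the number of failures at t
    and n is the number still at risk just before t."""
    records = sorted(records)
    n_at_risk = len(records)
    surv, curve, i = 1.0, [], 0
    while i < len(records):
        t = records[i][0]
        d = sum(1 for time, event in records if time == t and event == 1)
        if d:
            surv *= 1.0 - d / n_at_risk
            curve.append((t, surv))
        leaving = sum(1 for time, _ in records if time == t)
        n_at_risk -= leaving  # failures and censorings both leave the risk set
        i += leaving
    return curve

curve = kaplan_meier(data)
```

The key point the protocol makes is visible here: the censored records at 2.0 and 5.0 years still contribute to the risk set before they drop out, information a binary failure label would discard.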
Publisher: BMJ
Date: 27-08-2021
DOI: 10.1136/SEXTRANS-2021-055169
Abstract: Rates of HIV, syphilis and gonorrhoea have increased over the past 20 years in men who have sex with men (MSM). Contact tracing strategies have increased the number of MSM attending clinics as sexual contacts. Understanding the outcomes of contact tracing could inform future public health policies to reduce the burden of STIs in MSM. We aimed to describe the contribution of MSM attending as notified sexual contacts of patients with HIV, syphilis and gonorrhoea to the overall diagnoses of HIV, syphilis and gonorrhoea in MSM in a cross-sectional study. We collected data on all MSM diagnosed with HIV, syphilis and gonorrhoea in 2019 and evaluated which of these MSM were tested because they attended as a sexual contact. Sexual contacts of HIV, syphilis and gonorrhoea contributed 20% (95% CI 17.3% to 23.7%) of all diagnoses of HIV (3 of 30, 10%), syphilis (28 of 183, 15%) or gonorrhoea (98 of 420, 23%) in the study period. Asymptomatic sexual contacts contributed 12% (95% CI 9.6% to 14.9%) of all diagnoses of HIV (3 of 30, 10%), syphilis (16 of 183, 9%) and gonorrhoea (57 of 420, 14%). The proportion of MSM diagnosed with gonorrhoea attending as sexual contacts of gonorrhoea (21%) was significantly greater than that of MSM diagnosed with HIV attending as sexual contacts of HIV (3%) or MSM diagnosed with syphilis attending as sexual contacts of syphilis (4%) (p < 0.001). Furthermore, the proportion of MSM diagnosed with syphilis attending as sexual contacts of another STI (11%) was significantly greater than that of MSM diagnosed with HIV attending as contacts of another STI (7%) or MSM diagnosed with gonorrhoea attending as sexual contacts of another STI (2%) (p < 0.001). Contact tracing contributes significantly to the overall diagnoses of HIV, syphilis and gonorrhoea, including asymptomatic sexual contacts, in our population. 
Further efforts to increase the yield from contact tracing may continue to reduce the burden of HIV, syphilis and gonorrhoea within sexual networks of MSM.
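The abstract's interval for the pooled proportion (3 + 28 + 98 = 129 contacts out of 30 + 183 + 420 = 633 diagnoses) is close to what a simple normal-approximation confidence interval gives; the published figures may come from a different interval method (e.g. Wilson or exact), so this is only an illustrative check.

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Normal-approximation 95% confidence interval for a proportion:
    p +/- z * sqrt(p * (1 - p) / n)."""
    p = successes / n
    half_width = z * math.sqrt(p * (1.0 - p) / n)
    return p - half_width, p + half_width

# Contacts contributing to diagnoses, pooled from the abstract's figures:
# 3/30 (HIV) + 28/183 (syphilis) + 98/420 (gonorrhoea) = 129 of 633.
low, high = proportion_ci(3 + 28 + 98, 30 + 183 + 420)
```

The resulting interval of roughly 17% to 24% brackets the abstract's reported 17.3% to 23.7%.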
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 05-1999
Publisher: Elsevier BV
Date: 11-2005
DOI: 10.1016/J.CLINTHERA.2005.11.002
Abstract: The purpose of this study was to evaluate the cost-effectiveness of sirolimus compared with cyclosporin for the postsurgical management of renal transplant recipients, from the perspective of the UK National Health Service and the Personal Social Service. A discrete-event stochastic simulation model was developed to evaluate both cost-effectiveness and cost utility over 10 and 20 years after transplant, using historical data on 937 renal transplant recipients from the University Hospital of Wales in Cardiff, United Kingdom. The simulation was designed to forecast the incidence of acute rejection events, graft failure, retransplant, frequency of hemodialysis (HD) and peritoneal dialysis (PD), and death. Cox proportional hazard models were derived from historical transplant data, in which 1-, 2-, and 3-year post-transplant serum creatinine levels were used as the key drivers for predicting graft success and survival. Costs were reported as year-2003 UK pounds sterling (1 UK pound = US $1.76). Probabilistic sensitivity analysis was conducted and results reported with particular attention to two threshold values, £30,000/QALY and £20,000/QALY. Over a 10-year time horizon, treatment with sirolimus was projected to produce a gain of 0.60 discounted year of functioning graft with a cost saving of £276 per patient. Over a 20-year time horizon these benefits increased to 1.59 discounted years of functioning graft and a cost saving of £7,405 per patient. Using sensitivity analysis of the 10-year model, the only factors found to cause the probability of exceeding a £30,000/QALY ceiling to be >5% were the proportion of subjects maintaining continuous graft function and the use of low-dose cyclosporin. With the 20-year model, sirolimus maintained cost-effectiveness across most scenarios in sensitivity analysis. 
In this model analysis, sirolimus was cost-effective compared with cyclosporin for 10 to 20 years after renal transplantation in the United Kingdom, from the perspective of the UK National Health Service and Personal Social Service.
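A discrete-event simulation of the kind described above can be sketched in a few lines: sample times to competing events, process the earliest first, and accrue per-state costs. The event rates and costs below are assumptions for illustration, not the model's actual parameters.

```python
import heapq
import random

random.seed(1)

# Illustrative annual event rates and per-state annual costs (assumed values).
RATE_GRAFT_FAIL = 0.05
RATE_DEATH = 0.03
COST_FUNCTIONING = 5_000   # per year with a functioning graft
COST_DIALYSIS = 25_000     # per year back on dialysis
HORIZON = 20.0             # years

def simulate_patient():
    """One discrete-event trajectory: sample exponential times to competing
    events, process the earliest first, and accrue per-state annual costs."""
    t, cost, on_dialysis = 0.0, 0.0, False
    events = []
    heapq.heappush(events, (random.expovariate(RATE_GRAFT_FAIL), "fail"))
    heapq.heappush(events, (random.expovariate(RATE_DEATH), "death"))
    while events:
        when, what = heapq.heappop(events)
        when = min(when, HORIZON)
        rate = COST_DIALYSIS if on_dialysis else COST_FUNCTIONING
        cost += (when - t) * rate
        t = when
        if t >= HORIZON or what == "death":
            break
        if what == "fail":
            on_dialysis = True  # graft failed; switch cost state
    return cost

mean_cost = sum(simulate_patient() for _ in range(2_000)) / 2_000
```

Unlike a Markov cohort model, each simulated patient here carries an exact event time rather than a cycle index, which is what lets a DES avoid cycle-length discretisation error.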
Publisher: Elsevier BV
Date: 09-1997
DOI: 10.1038/KI.1997.373
Abstract: To determine the effect of the ACE gene insertion/deletion (I/D) polymorphism, the angiotensinogen gene M235T polymorphism and the angiotensin 1 receptor gene A1166C polymorphism on the age of onset of end-stage renal failure (ESRF) in PKD1 adult autosomal-dominant polycystic kidney disease (ADPKD), 189 individuals from 46 families with PKD1 were genotyped for each polymorphism. Of the 189 patients, 52 (28%) reached ESRF at an average age of 48 +/- 1 years. In patients genotyped for the ACE gene insertion/deletion polymorphism, the frequencies of the DD, ID and II genotypes were similar to those expected from Hardy-Weinberg equilibrium. In patients with ESRF there was an excess of patients homozygous for the deletion allele (DD: 48%; χ² = 9.97, 1 df, P = 0.002). Cumulative renal survival was significantly reduced among those with the DD genotype compared to the ID and II genotypes. The estimated mean renal survival times (95% confidence intervals) were: DD, 52 years [48, 57]; II, 59 years [54, 63]; ID, 64 years [56, 72]; χ² = 6.13, 1 df, P = 0.013, DD versus ID/II. The mean age of renal failure was significantly younger in the DD genotype compared to the ID and II genotypes (DD, ID and II: 44 +/- 2, 49 +/- 2 and 54 +/- 3 years, respectively; P < 0.05 DD vs. ID, P < 0.05 DD vs. II). Ten of the eleven patients who reached ESRF before the age of 40 were homozygous for the deletion allele. The relative risk of ESRF below the age of 40 for the DD genotype was 17. For all ages there was an overall increased risk of 1.4 for ESRF with the DD genotype. There was no interaction between age of onset of ESRF and either the angiotensinogen M235T allele or the angiotensin 1 receptor A1166C polymorphism. This study strongly suggests that PKD1 patients homozygous for the deletion allele of the ACE gene are at increased risk of developing ESRF at an early age.
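The Hardy-Weinberg check mentioned in this abstract amounts to comparing observed genotype counts with expectations computed from the estimated allele frequency. The counts below are invented for illustration; they are not the study's data.

```python
# Observed genotype counts for a biallelic (D/I) polymorphism -- illustrative
# values only, not the study's data.
observed = {"DD": 60, "ID": 95, "II": 34}

def hardy_weinberg_chi2(obs):
    """Chi-square goodness of fit against Hardy-Weinberg expectations.
    The D-allele frequency p is estimated from the genotype counts; the
    expected counts are then n*p^2, 2*n*p*(1-p) and n*(1-p)^2."""
    n = sum(obs.values())
    p = (2 * obs["DD"] + obs["ID"]) / (2 * n)  # each person carries 2 alleles
    expected = {
        "DD": n * p * p,
        "ID": 2 * n * p * (1 - p),
        "II": n * (1 - p) * (1 - p),
    }
    chi2 = sum((obs[g] - expected[g]) ** 2 / expected[g] for g in obs)
    return p, chi2

p, chi2 = hardy_weinberg_chi2(observed)
```

A χ² below the 3.84 critical value (1 df, α = 0.05) is consistent with Hardy-Weinberg equilibrium, as the abstract reports for its genotype frequencies.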
Publisher: BMJ
Date: 02-08-2023
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 04-2003
Publisher: Frontiers Media SA
Date: 11-12-2013
DOI: 10.1111/TRI.12026
Abstract: New-onset diabetes mellitus after transplantation (NODAT) is a serious complication following renal transplantation. In this cohort study, we studied 118 nondiabetic renal transplant recipients to examine whether indices of insulin resistance and secretion calculated before transplantation and at 3 months post-transplantation are associated with the development of NODAT within 1 year. We also analysed the long-term impact of early diagnosed NODAT. Insulin indices were calculated using homeostasis model assessment (HOMA) and McAuley's Index. NODAT was diagnosed using fasting plasma glucose. Median follow-up was 11 years. The cumulative incidence of NODAT at 1 year was 37%. By logistic regression, recipient age (per year) was the only significant pretransplant predictor of NODAT (OR 1.04, CI 1.009-1.072), while age (OR 1.04, CI 1.005-1.084) and impaired fasting glucose (OR 2.97, CI 1.009-8.733) were significant predictors at 3 months. Pretransplant and 3-month insulin resistance and secretion indices did not predict NODAT. All-cause mortality was significantly higher in recipients developing NODAT within 1 year compared with those remaining nondiabetic (44% vs. 22%, log-rank P = 0.008). By Cox regression analysis, age (HR 1.075, CI 1.042-1.110), 1-year creatinine (HR 1.007, CI 1.004-1.010) and NODAT within 3 months (HR 2.4, CI 1.2-4.9) were independent predictors of death. In conclusion, NODAT developing early after renal transplantation was associated with poor long-term patient survival. Insulin indices calculated pretransplantation using HOMA and McAuley's Index did not predict NODAT.
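The homeostasis model assessment referred to in this abstract is computed from fasting glucose and insulin. The sketch below shows the standard HOMA-IR formula with illustrative input values; the study also used McAuley's Index, which is not shown here.

```python
def homa_ir(fasting_glucose_mmol_l, fasting_insulin_mu_l):
    """Standard HOMA insulin-resistance index:
    (fasting glucose [mmol/L] * fasting insulin [mU/L]) / 22.5."""
    return fasting_glucose_mmol_l * fasting_insulin_mu_l / 22.5

# Illustrative inputs, not patient data from the study.
value = homa_ir(5.0, 9.0)
```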
Publisher: Elsevier BV
Date: 08-2014
DOI: 10.1016/J.BBABIO.2014.03.011
Abstract: The chromatophores of Rhodobacter (Rb.) sphaeroides represent a minimal bio-energetic system, which efficiently converts light energy into usable chemical energy. Despite extensive studies, several issues pertaining to the morphology and molecular architecture of this elemental energy conversion system remain controversial or unknown. To tackle these issues, we combined electron microscope tomography, immuno-electron microscopy and atomic force microscopy. We found that the intracellular Rb. sphaeroides chromatophores form a continuous reticulum rather than existing as discrete vesicles. We also found that the cytochrome bc1 complex localizes to fragile chromatophore regions, which most likely constitute the tubular structures that interconnect the vesicles in the reticulum. In contrast, the peripheral light-harvesting complex 2 (LH2) is preferentially hexagonally packed within the convex vesicular regions of the membrane network. Based on these observations, we propose that the bc1 complexes are in the inter-vesicular regions and surrounded by reaction center (RC) core complexes, which in turn are bounded by arrays of peripheral antenna complexes. This arrangement affords rapid cycling of electrons between the core and bc1 complexes while maintaining efficient excitation energy transfer from LH2 domains to the RCs.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 15-02-2008
Publisher: Elsevier BV
Date: 06-1998
DOI: 10.1016/S0041-1345(98)00228-0
Abstract: [This corrects the article DOI: 10.1002/ece3.9417.].
Publisher: SAGE Publications
Date: 20-06-2023
Publisher: SAGE Publications
Date: 23-05-2020
Abstract: We aim to identify associations that may help support strategies to increase job satisfaction and reduce unscheduled time off work for nurses. Given current concerns regarding the nursing workforce and retention, it is vital we identify strategies and factors which maintain job satisfaction, support staff retention and reduce unscheduled time off work. As part of a quality improvement project, we conducted and distributed an online anonymous survey. Likert scales were used to measure job satisfaction, perceived quality of care, wellbeing, and unscheduled time off work. We explored participation in project work of any kind in the preceding 12 months, and captured nursing experience and current area of practice (inpatient/outpatient). A total of 350 complete responses were analysed. Nurses engaged in research or quality improvement projects (QIPs) were more likely to report higher perceived levels of patient care (p = 0.0001), wellbeing (p = 0.0001) and job satisfaction (p = 0.0001), and reported lower levels of unscheduled time off work (p = 0.0001). Nurses engaged in research or QIPs reported higher levels of job satisfaction and wellbeing, perceived higher levels of care in their workplace, and had lower levels of unscheduled time off work. We suggest that involving nurses in research/QIPs may reduce workforce instability and improve job satisfaction.
Publisher: Springer Science and Business Media LLC
Date: 18-07-2022
Publisher: Elsevier BV
Date: 10-2019
DOI: 10.1016/J.IJMEDINF.2019.103957
Abstract: Machine learning has been increasingly used to develop predictive models to diagnose different disease conditions. The heterogeneity of the kidney transplant population makes predicting graft outcomes extremely challenging. Several kidney graft outcome prediction models have been developed using machine learning and are available in the literature. However, a systematic review of machine learning-based prediction methods applied to kidney transplantation has not been done to date. The main aim of our study was to perform an in-depth systematic analysis of different machine learning methods used to predict graft outcomes among kidney transplant patients, and to assess their usefulness as an aid to decision-making. A systematic review of machine learning methods used to predict graft outcomes among kidney transplant patients was carried out using a search of the Medline, Cumulative Index to Nursing and Allied Health Literature, EMBASE, PsycINFO and Cochrane databases. A total of 295 articles were identified and extracted. Of these, 18 met the inclusion criteria. Most of the studies were published in the United States after 2010. The population size used to develop the models varied from 80 to 92,844, and the number of features in the models ranged from 6 to 71. The most common machine learning methods used were artificial neural networks, decision trees and Bayesian belief networks. Most of the machine learning-based predictive models predicted graft failure with high sensitivity and specificity. Only one machine learning-based prediction model had modelled time-to-event (survival) information. Seven studies compared the predictive performance of machine learning models with traditional regression methods, and the performance of machine learning methods was found to be mixed when compared with traditional regression methods. There was wide variation in the size of the study population and the input variables used. However, prediction accuracy provided mixed results when machine learning and traditional predictive methods were compared. Based on reported gains in predictive performance, machine learning has the potential to improve kidney transplant outcome prediction and aid medical decision-making.
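Most models in the review above were evaluated by sensitivity and specificity. As a minimal illustrative sketch (not code from any of the reviewed studies; the labels and predictions below are invented), these two metrics are computed from a binary confusion matrix as follows:

```python
def sensitivity_specificity(y_true, y_pred):
    """Return (sensitivity, specificity) for binary labels (1 = graft failure)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative outcomes (1 = graft failed) and model predictions
y_true = [1, 1, 1, 0, 0, 0, 0, 1]
y_pred = [1, 1, 0, 0, 0, 1, 0, 1]
sens, spec = sensitivity_specificity(y_true, y_pred)
print(sens, spec)  # → 0.75 0.75
```

Note that a model can score highly on both metrics yet still calibrate poorly, which is why the review also flags calibration and time-to-event modelling as gaps.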
Publisher: Elsevier BV
Date: 06-1998
DOI: 10.1016/S0041-1345(98)00243-7
Abstract: There are more than 140 million annual visits to emergency departments (EDs) in the US. The role of EDs in providing care at or near the end of life is not well characterized. To determine the frequency of death in the ED or within 1 month of an ED visit in an all-age, all-payer national database. The retrospective cohort study used patient-level data from the nationally representative Optum clinical electronic health record data set for 2010 to 2020. Data were analyzed from January to March 2022. Exposures were age, Charlson Comorbidity Index (CCI), and year of ED encounter. The primary outcome was death in the ED, overall and stratified by age, CCI, or year. A key secondary outcome was death within 1 month of an ED encounter. We extrapolated to make national estimates using US Census and Nationwide Emergency Department Sample data. Among a total of 104 113 518 individual patients with 96 239 939 ED encounters, 205 372 ED deaths were identified in Optum, for whom median (IQR) age was 72 (53 to >80) years, 114 582 (55.8%) were male, and 152 672 (74.3%) were White. ED death affected 0.20% of overall patients and accounted for 0.21% of ED encounters. An additional 603 273 patients died within 1 month of an ED encounter. Extrapolated nationally, ED deaths accounted for 11.3% of total deaths from 2010 to 2019, and 33.2% of all decedents nationally visited the ED within 1 month of their death. The proportion of total national deaths occurring in the ED decreased by 0.27% annually (P for trend = .003) but the proportion who died within 1 month of an ED visit increased by 1.2% annually (P for trend < .001). Compared with all ED encounters, patients with visits resulting in death were older, more likely to be White, male, and not Hispanic, and had higher CCI. Among ED encounters for patients aged older than 80 years, nearly 1 in 12 died within 1 month. This retrospective cohort study found that deaths during or shortly after ED care were common, especially among older patients and those with chronic comorbidities. EDs must identify patients for whom end-of-life care is necessary or preferred and be equipped to deliver this care excellently.
Publisher: Research Square Platform LLC
Date: 04-03-2021
DOI: 10.21203/RS.3.RS-272583/V1
Abstract: Background: Kidney graft failure risk prediction models assist evidence-based medical decision-making in clinical practice. Our objective was to develop and validate statistical and machine learning predictive models to predict death-censored graft failure following deceased donor kidney transplant, using time-to-event (survival) data in a large national dataset from Australia. Methods: Data included donor and recipient characteristics (n = 98) of 7,365 deceased donor transplants from January 1st, 2007 to December 31st, 2017 conducted in Australia. Seven variable selection methods were used to identify the most important independent variables included in the model. Predictive models were developed using: survival tree, random survival forest, survival support vector machine and Cox proportional hazards regression. The models were trained using 70% of the data and validated using the remaining 30%. The model with the best discriminatory power, assessed using the concordance index (C-index), was chosen as the best model. Results: Two models, developed using Cox regression and random survival forest, had the highest C-index (0.67) in discriminating death-censored graft failure. The best-fitting Cox model used seven independent variables and showed a moderate level of prediction accuracy (calibration). Conclusion: This index displays sufficient robustness to be used in pre-transplant decision-making and may perform better than currently available tools.
Publisher: Elsevier BV
Date: 07-2019
Publisher: Elsevier BV
Date: 02-2001
Abstract: The aim of the current study was to characterize the effects of prolonged hyperglycemia on renal structure and function using a model of non-insulin-dependent diabetes mellitus: the Goto Kakizaki (GK) rat, which does not have confounding variables, such as hyperlipidemia, obesity, or elevated blood pressure. The data show that hyperglycemia in this model was not associated with the development of significant proteinuria, but it was associated with the development of definitive age-dependent renal structural changes. These changes consisted of thickening of the glomerular basement membrane at 35 weeks and of the tubular basement membrane. This thickening was accompanied by marked glomerular hypertrophy resulting from a parallel increase in total capillary luminal volume and mesangial volume, but fractional capillary and mesangial volumes remained unchanged. There was evidence of podocyte injury, as assessed by de novo expression of desmin. In contrast, there was no evidence of mesangial cell activation, as assessed by their de novo expression of alpha-SMA. Interstitial monocyte/macrophage influx increased significantly in GK rats at 12 weeks compared with Wistar controls. Glomerular macrophage infiltration was elevated significantly in 35-week GK rats. The structural changes described in the GK rat are similar to those described in prolonged non-insulin-dependent diabetes mellitus patients who have not developed overt renal disease. This model allows us to investigate further the mechanisms involved in the pathogenesis of the consequences of prolonged hyperglycemia.
Publisher: BMJ
Date: 14-02-2012
DOI: 10.1136/EMERMED-2011-200101
Abstract: Various approaches have been used to identify possible routes for improvement of patient flow within an emergency unit (EU). One such approach is to use simulation to create a 'real world' model of an EU and carry out various tests to gauge ways of improvement. This paper proposes a novel approach in which simulation is used to create a 'perfect world model'. The EU at a major UK hospital is modelled not as it is, but as it could be. The 'efficiency gap' between the 'perfect world' and the 'real world' demonstrates how operational research can be used effectively to identify the location of bottlenecks in the current 'whole hospital' patient pathway and can be used in the planning and managing of hospital resources to ensure the most effective use of those resources.
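The 'perfect world' idea can be caricatured with a toy discrete-event queue: run the same arrivals through the unit twice, once with realistic resource constraints and once with ample capacity, and read the efficiency gap off as the difference in mean waiting time. This is a minimal sketch under invented parameters (arrival interval, treatment time, cubicle counts), not the paper's actual model:

```python
import heapq

def mean_wait(arrivals, service_time, servers):
    """Mean patient wait (same time units as arrivals) in a FIFO
    multi-server queue with deterministic service times."""
    free_at = [0.0] * servers          # times at which each cubicle is next free
    heapq.heapify(free_at)
    total_wait = 0.0
    for t in arrivals:
        free = heapq.heappop(free_at)  # earliest-available cubicle
        start = max(t, free)
        total_wait += start - t
        heapq.heappush(free_at, start + service_time)
    return total_wait / len(arrivals)

arrivals = [i * 10 for i in range(60)]  # one arrival every 10 minutes
real = mean_wait(arrivals, service_time=25, servers=2)     # constrained "real world"
perfect = mean_wait(arrivals, service_time=25, servers=4)  # ample "perfect world"
print(real, perfect, real - perfect)  # → 72.5 0.0 72.5 (the "efficiency gap")
```

With two cubicles the service capacity (2/25 patients per minute) is below the arrival rate (1/10), so the queue grows and waits balloon; the gap against the unconstrained run localizes the bottleneck, which is the essence of the 'perfect world' comparison.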
Publisher: Wiley
Date: 30-04-2021
DOI: 10.1111/HIV.13112
Publisher: IEEE
Date: 03-2012
Publisher: Informa UK Limited
Date: 18-12-2014
DOI: 10.3111/13696998.2013.869227
Abstract: Secondary hyperparathyroidism (SHPT) is a major complication of end-stage renal disease (ESRD). For the National Health Service (NHS) to make appropriate choices between medical and surgical management, it needs to understand the cost implications of each. A recent pilot study suggested that the current NHS healthcare resource group tariff for parathyroidectomy (PTX) (£2071 and £1859 in patients with and without complications, respectively) is not representative of the true costs of surgery in patients with SHPT. This study aims to provide an estimate of healthcare resources used to manage patients and estimate the cost of PTX in a UK tertiary care centre. Resource use was identified by combining data from the Proton renal database and routine hospital data for adults undergoing PTX for SHPT at the University Hospital of Wales, Cardiff, from 2000–2008. Data were supplemented by a questionnaire, completed by clinicians in six centres across the UK. Costs were obtained from NHS reference costs, the British National Formulary and published literature. Costs were applied for the pre-surgical, surgical, peri-surgical, and post-surgical periods so as to calculate the total cost associated with PTX. One hundred and twenty-four patients (mean age=51.0 years) were identified in the database and 79 from the questionnaires. The main costs identified in the database were the surgical stay (mean=£4066, SD=£,130), the first month post-discharge (£465, SD=£176), and the 3 months prior to surgery (£399, SD=£188); the average total cost was £4932 (SD=£4129). From the questionnaires the total cost was £5459 (SD=£943). It is possible that the study was limited by missing data within the database, as well as by recall bias associated with the clinicians completing the questionnaires. This analysis suggests that the costs associated with PTX in SHPT exceed the current NHS tariffs for PTX.
The cost implications associated with PTX need to be considered in the context of clinical assessment and decision-making, but healthcare policy and planning may warrant review in the light of these results.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 15-06-2010
Publisher: BMJ
Date: 22-06-2022
Publisher: Elsevier BV
Date: 06-2009
DOI: 10.1111/J.1600-6143.2009.02644.X
Abstract: Interventions to attenuate abnormal glycemia posttransplantation are required. In addition, surrogate markers of declining glycemic control are valuable. Statins may have pleiotropic properties that attenuate abnormal glucose metabolism. We hypothesized statins would improve glucose metabolism and HbA1c would be advantageous as a surrogate for worsening glycemia. We conducted a prospective, randomized, placebo controlled, crossover study in 20 nondiabetic renal transplant recipients at low risk for NODAT and compared effects of rosuvastatin on insulin secretion/sensitivity. Mathematical model analysis of an intravenous glucose tolerance test determined first-phase insulin secretion, insulin sensitivity and disposition index. Second-phase insulin secretion was determined with a meal tolerance test. Biochemical/clinical parameters were also assessed. Rosuvastatin significantly improved total cholesterol (-30%, p < 0.001), LDL cholesterol (-44%, p < 0.001) and triglycerides (-19%, p = 0.013). C-reactive protein decreased but failed to achieve statistical significance (-31%, p = 0.097). Rosuvastatin failed to influence any glycemic physiological parameter, although an inadequate timeframe to allow pleiotropic mechanisms to clinically manifest raises the possibility of a type II statistical error. On multivariate analysis, glycated hemoglobin (HbA1c) correlated with disposition index (R(2)= 0.201, p = 0.006), first-phase insulin secretion (R(2)= 0.106, p = 0.049) and insulin sensitivity (R(2)= 0.136, p = 0.029). Rosuvastatin fails to modify glucose metabolism in low-risk patients posttransplantation but HbA1c is a useful surrogate for declining glycemic control.
Publisher: SAGE Publications
Date: 30-09-2020
Abstract: The characteristics and serological responses of primary syphilis are not completely understood. We aimed to describe the characteristics, the serological responses and presumptive treatment of primary syphilis in HIV-positive and -negative men who have sex with men (MSM). We conducted a retrospective review of microbiological and demographic information from MSM presenting with primary syphilis. There were 111 cases of primary syphilis in MSM; the median age was 46 (IQR = 37–53 years) and 40 (36%) were living with HIV. Fifty percent of MSM presented with painful lesions and 14% with extra-genital lesions. Extra-genital lesions were significantly more likely to be painful than genital lesions (OR = 4.72, 95%CI = 1.25–17.83, p = 0.02). Overall, a reactive serological response demonstrated a sensitivity of 80% (57/71) compared with Treponema pallidum PCR. Serology was more sensitive in MSM with no previous syphilis (OR = 3.38, 95%CI = 1.00–11.43, p = 0.05). MSM presenting with painless lesions were more likely to be treated presumptively (OR = 3.39, 95%CI = 1.38–8.33, p = 0.002). There were no differences in the characteristics, serological responses or management according to HIV status. In summary: fifty percent of MSM with primary syphilis presented with painful lesions; extra-genital lesions were more likely to be painful than genital lesions; serology was positive in 80%; and there were no differences between HIV-positive and -negative MSM. Understanding the characteristics of primary syphilis will underpin public health campaigns.
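The odds ratios quoted in abstracts like the one above (e.g. OR = 4.72, 95%CI = 1.25–17.83) are standard 2x2-table odds ratios with a Woolf (log-scale) confidence interval. A minimal sketch with hypothetical cell counts, not the paper's underlying data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI for a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: painful/painless split for extra-genital vs genital lesions
or_, lo, hi = odds_ratio_ci(a=12, b=3, c=50, d=46)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # OR = 3.68 with its 95% CI
```

The wide intervals in the abstract (small cell counts, e.g. only 14% extra-genital lesions) fall directly out of the `1/a + 1/b + 1/c + 1/d` term: small cells inflate the standard error.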
Publisher: SAGE Publications
Date: 26-04-2023
Publisher: CSIRO Publishing
Date: 16-07-2021
DOI: 10.1071/SH20189
Abstract: Background: Eleven percent of people living with HIV in Australia remain unaware of their diagnosis, and there are missed opportunities for HIV testing in priority settings in New South Wales. HIV testing remains low outside of sexual health clinics, with the exception of antenatal settings where HIV testing is routine. To understand why HIV testing rates are low, we sought to identify health worker-related barriers to HIV testing. Methods: We conducted an anonymous online survey of health workers in Western Sydney Local Health District (WSLHD) in September 2019. Tick-box and Likert scale responses were analysed using Chi-square and Kruskal–Wallis statistical tests, and free-text responses were analysed with thematic analysis. Results: Three percent (n = 420) of WSLHD's estimated 14 000 health workers responded. These included 317 clinicians (171 nurses, 65 doctors, 56 allied health professionals (AHPs), 25 midwives) and 103 health workers in non-clinical roles. Health workers were from a variety of in-patient/out-patient settings. Many health workers (291/420, 69%; 95%CI = 64.9–73.7%) were unaware that HIV testing is offered in their areas; doctors (82%) and midwives (80%) were more aware than nurses (23%) and AHPs (11%) (P < 0.0001). Doctors (Likert scores = 3.62 and 3.45/5) and midwives (2.84 and 2.76) were significantly more comfortable discussing and more confident offering HIV testing than nurses (2.42 and 1.81) or AHPs (1.83 and 0.91) (P < 0.0001 for both). The top five barriers to HIV testing were (1) procedural knowledge, (2) identification of at-risk patients, (3) HIV knowledge, (4) positive result management, and (5) privacy concerns. Free-text responses highlighted perceived stigma, testing/result responsibilities and resource challenges as barriers to HIV testing. Conclusions: Clinicians working in priority settings and with priority populations require more education and support to increase targeted HIV testing.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 27-04-2014
Publisher: Elsevier BV
Date: 08-2007
DOI: 10.1016/J.JSB.2007.01.021
Abstract: Atomic force microscopy (AFM) has developed into a powerful tool in membrane biology. AFM features an outstanding signal-to-noise ratio that allows substructures on individual macromolecules to be visualized. Most recently, AFM topographs have shown the supramolecular assembly of the bacterial photosynthetic complexes in native membranes. Here, we have determined the translational and rotational degrees of freedom of the complexes in AFM images of multi-protein assemblies, in order to build realistic atomic models of supramolecular assemblies by docking high-resolution structures into the topographs. Membrane protein assemblies of megadalton size comprising several hundreds of polypeptide chains and pigments were built with Angstrom precision.
Publisher: Elsevier BV
Date: 07-2003
DOI: 10.1034/J.1600-6143.2003.00133.X
Abstract: The TRansplant European Survey on Anemia Management (TRESAM) documented the prevalence and management of anemia in kidney transplant recipients. Data from 72 transplant centers in 16 countries were screened, involving 4263 patients who had received transplants 6 months, 1, 3 or 5 years earlier. The mean age of transplant recipients was 45.5 years at transplantation. The most common etiology was chronic glomerulonephritis. The most common comorbidities were coronary artery disease, hepatitis B/C, and type 2 diabetes. The mean hemoglobin levels before transplantation were significantly higher in the more recently transplanted recipients. At enrollment, 38.6% of patients were found to be anemic. Of the 8.5% of patients who were considered severely anemic, only 17.8% were treated with epoetin. There was a strong association between hemoglobin and graft function: of the 904 patients with serum creatinine > 2 mg/dL, 60.1% were anemic, vs. 29.0% of those with serum creatinine ≤ 2 mg/dL (p < 0.01). Therapy with angiotensin-converting enzyme (ACE) inhibitors, angiotensin II receptor antagonists, mycophenolate mofetil (MMF) or azathioprine was also associated with a higher likelihood of anemia. The prevalence of anemia in the transplant recipients was remarkably high and appeared to be associated with impaired renal function and with ACE inhibitor and angiotensin II receptor antagonist use. Further studies should be carried out to interpret whether appropriate management of anemia after kidney transplantation may improve long-term outcome.
Publisher: American Physiological Society
Date: 09-1995
DOI: 10.1152/AJPRENAL.1995.269.3.F331
Abstract: Renal function was assessed at 2 and 8 wk after infusion of puromycin into the left renal artery of Munich Wistar rats. At 2 wk, albumin excretion averaged 90 +/- 12 micrograms/min in the left kidney and 4 +/- 1 microgram/min in the right kidney. Unilateral nephrosis was accompanied by reduction in the glomerular filtration rate (GFR) (left, 0.71 +/- 0.04; right, 1.31 +/- 0.02 ml/min) and by impaired excretion of sodium (FENa: left, 0.025 +/- 0.004; right, 0.064 +/- 0.006%). Reductions in GFR and FENa in the nephrotic kidney were not reversed by acute angiotensin II receptor blockade with losartan. At 8 wk, albumin excretion averaged 6 +/- 1 in the left kidney and 8 +/- 1 microgram/min in the right kidney. Recovery from nephrosis was accompanied by persistent reduction in GFR (left, 1.05 +/- 0.05; right, 1.41 +/- 0.05 ml/min) and impairment of sodium excretion in the previously nephrotic left kidney (left, 0.031 +/- 0.004; right, 0.051 +/- 0.004%). Losartan again did not return GFR and FENa toward normal. The reductions in GFR and FENa in the previously nephrotic left kidney were associated with structural changes, including intratubular casts, an increased fractional volume of the interstitium (left, 25 +/- 1; right, 15 +/- 1%), decreased fractional volume of tubules (left, 66 +/- 2; right, 77 +/- 1%), and glomerular collapse (left, 15 +/- 2; right, 1 +/- 1%). These findings suggest that tubulointerstitial injury can cause persistent reduction in GFR and impairment of sodium excretion after recovery from acute nephrosis.
Publisher: Elsevier BV
Date: 10-1994
DOI: 10.1038/KI.1994.357
Abstract: The effect of acute Ang II blockade on renal function in rats with reduced nephron number was assessed in micropuncture studies. The Ang II receptor blocker, losartan, was administered at a dose of 10 mg i.v. at two intervals following five-sixths renal ablation. At eight weeks following ablation, Ang II blockade (Ang IIX) increased sodium excretion [UNa V: Ang IIX, 2.2 +/- 0.4 microEq/min; time control (TC), 1.0 +/- 0.3 microEq/min; P < 0.05] but did not reduce mean arterial pressure (AP: Ang IIX, 142 +/- 6 mm Hg; TC, 151 +/- 6 mm Hg), glomerular transcapillary pressure (delta P: Ang IIX, 50 +/- 1 mm Hg; TC, 50 +/- 1 mm Hg), or urine albumin excretion (UAlb V: Ang IIX, 149 +/- 18 micrograms/min; TC, 168 +/- 20 micrograms/min). Similarly, at two weeks following ablation, Ang II blockade increased UNa V (Ang IIX, 2.8 +/- 0.4 microEq/min; TC, 0.5 +/- 0.2 microEq/min; P < 0.05) without reducing AP (Ang IIX, 132 +/- 6 mm Hg; TC, 140 +/- 7 mm Hg), delta P (Ang IIX, 50 +/- 3 mm Hg; TC, 48 +/- 2 mm Hg), or UAlb V (Ang IIX, 32 +/- 3 micrograms/min; TC, 36 +/- 10 micrograms/min). These findings indicate that within the remnant kidney, Ang II promotes sodium retention but does not have an acutely reversible effect on glomerular pressure or permselectivity.
Publisher: Springer Science and Business Media LLC
Date: 2012
Publisher: Wiley
Date: 22-08-2021
DOI: 10.1111/JDV.17589
Publisher: Yonsei University College of Medicine
Date: 2004
Publisher: Wiley
Date: 07-10-2023
DOI: 10.1111/JDV.18627
Publisher: Wiley
Date: 07-2001
DOI: 10.1034/J.1600-051X.2001.028007706.X
Abstract: Severe gingival hyperplasia (GH) is one of the most frequent side-effects associated with the prescription of Cyclosporine-A (CsA). This study statistically modeled the medical and dental risk factors for the development of GH following CsA administration to determine whether renal function post-transplantation was related to the incidence or extent of GH in 236 consecutive renal transplant patients. All patients were at least 6 months post-transplant and medicated with both traditional oral CsA (n=220 individuals) and the new microemulsion form CsA-Me (n=229 individuals). Patients had either received CsA alone (n=45 individuals) or cyclosporine and nifedipine (n=191 individuals). Gingival overgrowth was assessed, and computerized data, available for all patients, included pre- and post-transplant medical history and post-transplant renal function, i.e., serum creatinine levels, documented rejection episodes and glomerular filtration rates (GFR). These data, together with CsA serum levels and last-recorded dose of CsA, CsA-Me, nifedipine, azathioprine and prednisolone, were analysed by multivariate regression analysis using SPSS. The extent and severity of hyperplasia was significantly correlated with the dosage and serum level of CsA at 3, 6 and 12 months post-transplantation; last recorded dosage, however (p<0.0001), was the most accurate predictor of hyperplasia. Gingivitis (p<0.0001) and plaque (p<0.002) were associated with hyperplasia. Duration of renal replacement therapy, age at transplantation, post-transplant interval, serum creatinine levels and documented rejection episodes were unrelated to the extent and severity of GH. Of all the renal variables, only the correlation of GFR with last recorded doses of CsA and CsA-Me approached significance; this was then considered for inclusion in the model. In a multiple regression analysis including GFR, however, only last CsA (and CsA-Me) doses and gingivitis score were selected for inclusion in the final model.
These data demonstrate that inter-patient variation in the extent and severity of GH and renal function post-transplantation are unrelated and are mediated independently.
Publisher: Frontiers Media SA
Date: 06-1998
Publisher: Elsevier BV
Date: 05-2012
Publisher: SAGE Publications
Date: 18-12-2021
Abstract: Within the UK, the majority of hepatitis A occurs in high-risk groups such as men who have sex with men (MSM). It has been estimated that 70% of MSM need immunity to provide adequate herd immunity. We aimed to estimate the proportion of MSM susceptible to hepatitis A over a 10-year period (2010–2019), and to explore associated demographic factors. Using our Electronic Patient Record system, we extracted anonymous clinical data for MSM at their first attendance, including hepatitis A IgG result, age, country of birth and diagnosis of an STI. Overall, 1401/6884 (20%) were tested for hepatitis A IgG at their first attendance, with 626/1401 (45%, 95% CI = 42%–47%) showing susceptibility. Testing rates increased between 2010 and 2019 (OR = 67.79, 95%CI = 39.09–117.60, p = .0001); however, susceptibility remained similar (OR = 0.98, 95%CI = 0.33–2.89, p = 0.98). MSM aged 35 and under had significantly higher susceptibility vs MSM aged over 35 (OR = 3.42, 95%CI = 2.71–4.31, p = .0001). UK-born MSM had significantly higher susceptibility vs non-UK born (OR = 1.50, 95%CI = 1.21–1.86, p = 0.0002). Susceptibility to hepatitis A in MSM may be higher than necessary to control future outbreaks. It is important that effective targeting of MSM, particularly young MSM, occurs at all levels of healthcare and does not rely solely on opportunistic presentation at a sexual health clinic.
Publisher: Elsevier BV
Date: 02-2002
DOI: 10.1046/J.1523-1755.2002.00149.X
Abstract: Chronic allograft nephropathy is an important cause of graft failure. Many donor and recipient factors contribute to its development. Prospective analysis of these factors has been hindered by the lack of sensitive and specific indicators of renal injury. As a consequence, protocol biopsies have been increasingly used in the assessment of renal allograft injury. We performed protocol renal allograft biopsies to prospectively examine the role of important determinants and mediators of chronic allograft nephropathy. A total of 51 consecutive cadaveric renal transplant recipients entered a randomized prospective study of tacrolimus (Tac) versus cyclosporine (CsA) microemulsion based immunosuppression. Study patients underwent protocol renal allograft biopsies at the time of engraftment and at 3, 6 and 12 months post-transplantation. Biopsies were analyzed by quantitative polymerase chain reaction (PCR) for mRNA for transforming growth factor-beta (TGF-beta), thrombospondin, and fibronectin. Measurements of renal structural injury were estimated by quantitative assessment of interstitial fibrosis and glomerulosclerosis. Changes in profibrotic growth factors and renal structural injury were related to donor and recipient determinants by stepwise regression analysis. Longitudinal assessment of renal injury demonstrated an early and progressive increase in mRNA for TGF-beta, thrombospondin (TSP) and fibronectin (FBN): TGF-beta baseline, 1.9 +/- 0.2 log copies; TGF-beta 6 months, 2.5 +/- 0.2 log copies, P < 0.05 6 months vs. baseline; TSP baseline, 1.9 +/- 0.2 log copies; TSP 6 months, 2.4 +/- 0.2 log copies, P < 0.05 6 months vs. baseline; FBN baseline, 2.0 +/- 0.2 log copies; FBN 12 months, 2.3 +/- 0.2 log copies, P < 0.05 12 months vs. baseline.
This increase in profibrotic growth factors within the allograft was associated with a significant increase in interstitial fibrosis (Vvi) on renal biopsies: Vvi baseline, 13 +/- 1%; Vvi 3 months, 18 +/- 1%; Vvi 6 months, 28 +/- 2%; Vvi 12 months, 34 +/- 2%; P < 0.05 for 3, 6, and 12 months vs. baseline. Histological analysis demonstrated chronic allograft nephropathy in 4% of biopsies at 3 months, 12% at 6 months and 49% at 12 months. These changes in renal structure were not associated with any change in creatinine clearance (CCr): CCr 3 months, 56 +/- 2 mL/min; CCr 24 months, 56 +/- 2 mL/min; P = NS. Stepwise regression analysis of key donor and recipient determinants of chronic renal injury identified calcineurin inhibitors and acute rejection episodes as important factors involved in the development of chronic renal injury. In particular, the use of cyclosporine compared to tacrolimus was associated with a tenfold increase in TGF-beta mRNA (TGF-beta mRNA at 6 months, CsA vs. Tac, 3 +/- 0.3 vs. 2 +/- 0.3 log copies, P < 0.05) and with increased interstitial fibrosis (Vvi at 6 months, CsA vs. Tac, 33 +/- 4% vs. 24 +/- 2%, P < 0.05). Changes in growth factors and renal structure predicted impaired renal function (CCr at 12 months, CsA vs. Tac, 53 +/- 4 mL/min vs. 62 +/- 2 mL/min, P < 0.05). Similarly, acute rejection episodes were associated with accelerated development of interstitial fibrosis (Vvi at 6 months, acute rejection vs. no rejection, 34 +/- 3% vs. 25 +/- 2%, P < 0.05), but not with changes in TGF-beta, thrombospondin or fibronectin expression. Our results suggest that structural injury develops early in the natural history of the renal allograft and is mediated, in part, by the early up-regulation of profibrotic growth factors. We have determined that calcineurin inhibitors, in particular cyclosporine, and acute rejection episodes are key factors in the development of renal structural injury.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 05-1999
Publisher: Springer Science and Business Media LLC
Date: 21-06-2021
DOI: 10.1186/S12874-021-01319-5
Abstract: Kidney graft failure risk prediction models assist evidence-based medical decision-making in clinical practice. Our objective was to develop and validate statistical and machine learning predictive models to predict death-censored graft failure following deceased donor kidney transplant, using time-to-event (survival) data in a large national dataset from Australia. Data included donor and recipient characteristics (n = 98) of 7,365 deceased donor transplants from January 1st, 2007 to December 31st, 2017 conducted in Australia. Seven variable selection methods were used to identify the most important independent variables included in the model. Predictive models were developed using: survival tree, random survival forest, survival support vector machine and Cox proportional hazards regression. The models were trained using 70% of the data and validated using the remaining 30%. The model with the best discriminatory power, assessed using the concordance index (C-index), was chosen as the best model. Two models, developed using Cox regression and random survival forest, had the highest C-index (0.67) in discriminating death-censored graft failure. The best-fitting Cox model used seven independent variables and showed a moderate level of prediction accuracy (calibration). This index displays sufficient robustness to be used in pre-transplant decision-making and may perform better than currently available tools.
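Discrimination in the study above is measured with Harrell's concordance index (C-index): the probability that, of two comparable patients, the one the model scores as higher risk is the one who fails first. A minimal pure-Python sketch with illustrative data (not the authors' implementation; ties in event time are simply skipped here):

```python
def concordance_index(times, events, risk_scores):
    """Harrell's C-index for right-censored survival data.
    A pair is comparable only if the earlier time is an observed event;
    it is concordant if that earlier failure has the higher risk score."""
    concordant = tied = comparable = 0
    n = len(times)
    for i in range(n):
        for j in range(i + 1, n):
            a, b = (i, j) if times[i] < times[j] else (j, i)  # a fails/censors first
            if times[a] == times[b] or not events[a]:
                continue  # tied times or earlier observation censored: not comparable
            comparable += 1
            if risk_scores[a] > risk_scores[b]:
                concordant += 1
            elif risk_scores[a] == risk_scores[b]:
                tied += 1
    return (concordant + 0.5 * tied) / comparable

times = [2, 4, 6, 8]           # months to graft failure or censoring
events = [1, 1, 0, 1]          # 1 = failure observed, 0 = censored
risks = [0.9, 0.7, 0.4, 0.2]   # model risk scores (higher = worse predicted outcome)
print(concordance_index(times, events, risks))  # → 1.0 (perfectly ranked cohort)
```

A C-index of 0.5 is no better than chance and 1.0 is perfect ranking, which puts the paper's 0.67 into context as moderate discrimination.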
Publisher: Wiley
Date: 10-2021
DOI: 10.1111/IMJ.15516
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 05-1999
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 05-1999
Publisher: BMJ
Date: 13-10-2023
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 15-02-2010
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 27-07-2009
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 27-06-2009
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 05-1999
Publisher: BMJ
Date: 19-04-2022
DOI: 10.1136/SEXTRANS-2020-054878
Abstract: There has been a significant increase in syphilis in men who have sex with men (MSM) in the UK over the past 20 years. Partner notification strategies have increased the number of MSM attending STI clinics as sexual contacts of syphilis. Current guidelines suggest testing and consideration of presumptive antimicrobial treatment. Syphilis treatment with benzathine penicillin requires clinic resources, is painful and is associated with complications, so it is important to consider strategies that rationalise presumptive antimicrobial use and promote antimicrobial stewardship. In a cross-sectional study, we aimed to determine whether any factors were associated with having syphilis among MSM attending as sexual contacts of syphilis, examining the clinical records of such attendees from January to December 2019. Of the 6613 MSM who attended for STI testing, 142 (2.1%) presented as sexual contacts of syphilis. The median age was 40 years (IQR 31–51); 43 of 142 (30%) were HIV positive, 38 of 142 (27%) had previously been diagnosed with and treated for syphilis, and 11 of 142 (8%) presented with symptoms (possible lesions of primary or secondary syphilis). Thirteen (9%; 95% CI 4.4 to 13.9) tested positive for syphilis on the day of presentation. MSM who were symptomatic (genital ulcer or body rash), HIV positive or had a history of syphilis were significantly more likely to test positive for syphilis (OR=51.88, 95% CI 3.01 to 893.14, p=0.007). In our clinic-based population of MSM presenting as sexual contacts of syphilis, the factors associated with testing positive were having HIV, having a history of syphilis and presenting with symptoms (possible lesions of primary or secondary syphilis). These factors could be used to rationalise antibiotic treatment among MSM presenting as sexual contacts of syphilis. Further research is needed to validate these findings in other populations of MSM and people affected by syphilis.
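As an arithmetic aside (not part of the original abstract), the quoted 95% CI of 4.4 to 13.9 for the 13 of 142 who tested positive is consistent with a standard Wald (normal-approximation) binomial interval. A minimal Python sketch of that calculation:

```python
import math

def wald_ci(successes, n, z=1.96):
    """Wald (normal-approximation) 95% CI for a binomial proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, p - half, p + half

p, lo, hi = wald_ci(13, 142)
print(f"{p:.1%} (95% CI {lo:.1%} to {hi:.1%})")  # → 9.2% (95% CI 4.4% to 13.9%)
```

The limits reproduce the abstract's 4.4–13.9% to one decimal place; the Wald interval is only an approximation and is less reliable for very small counts.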
Publisher: Springer Science and Business Media LLC
Date: 25-05-2010
Abstract: New-onset diabetes after transplantation, a common complication following kidney transplantation, is associated with adverse patient and graft outcomes. Our understanding of the risk factors associated with this metabolic disorder is improving, and both transplantation-specific and nonspecific factors are clearly involved. Knowledge of these risk factors is important so that clinicians can implement pre-emptive risk stratification strategies and guide therapeutic, risk-attenuation approaches in patients who develop transplant-associated hyperglycemia. In this Review, we explore the current understanding of the diverse range of risk factors that contribute to abnormal glucose metabolism after transplantation, with the aim of helping to guide clinical decision-making using appropriate risk stratification.
Publisher: Frontiers Media SA
Date: 23-08-1999
Abstract: To evaluate the role of tacrolimus in the treatment of chronic graft nephropathy (CGN), a pilot cross-sectional study was performed on 14 patients with deteriorating renal function and biopsy-proven CGN. Maintenance therapy was switched from cyclosporin to tacrolimus, and the effects of conversion on allograft function were assessed by estimated glomerular filtration rate (GFR) and clinical outcome. Minimum follow-up was 15 months. Two distinctive response patterns emerged: (i) continuing deterioration of renal function with no apparent benefit over the projected GFR trend (nine patients), and (ii) an unequivocal change in the GFR trend-line equation, with a reduced rate of deterioration in one patient and sustained improvement of GFR in four patients (reversal of the downward trend). Five of 14 patients (36%) benefited from replacing Neoral with Prograf. All five patients exceeded their estimated time of return to dialysis by a median of 41 weeks (range: 29-52) and their grafts continue to function.
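The response patterns above were judged against each patient's projected GFR trend. As a hedged sketch of that kind of comparison (the readings below are hypothetical, not study data), an ordinary least-squares slope of eGFR against time before and after conversion distinguishes continuing decline from an arrested or reversed trend:

```python
def slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Hypothetical monthly eGFR readings (mL/min), for illustration only:
months_pre,  gfr_pre  = [0, 3, 6, 9],     [48, 44, 40, 36]  # steady decline
months_post, gfr_post = [12, 15, 18, 21], [35, 35, 36, 37]  # decline arrested

print(slope(months_pre, gfr_pre))    # negative: deteriorating trend
print(slope(months_post, gfr_post))  # near zero/positive: trend reversed
```

A change in the fitted slope after the switch is what the abstract describes as a "change in the GFR trend line equation"; the study itself would also account for measurement noise and projected return-to-dialysis dates.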
Publisher: Wiley
Date: 24-07-2021
DOI: 10.1111/JDV.17503
Publisher: Wiley
Date: 07-09-2022
DOI: 10.1111/J.1399-0012.2006.00565.X
Abstract: The clinical impact of new-onset diabetes mellitus (NODM) is frequently underestimated by clinicians. NODM occurs in approximately 15-20% of renal transplant patients and 15% of liver transplant recipients. Diabetes after transplantation is a leading risk factor for cardiovascular events, with a higher prognostic value than in the non-transplant population. NODM also appears to have a negative influence on graft function, and graft survival rates after renal transplantation are significantly lower in patients who develop diabetes than in controls. Patient mortality following renal transplantation is generally found to be higher in patients with NODM, due to increased cardiovascular and peripheral vascular disease, accelerated graft deterioration and diabetes-related complications, notably infection. A renal registry analysis has reported an increase of 87% in risk of death following onset of NODM. There is also limited evidence that NODM is associated with increased risk of death in liver transplant patients. The relative incidence and severity of diabetic complications in transplant recipients have not been assessed rigorously in a clinical trial, but registry data indicate that 20% of renal transplant patients with NODM experience at least one clinically significant diabetic complication within three years. Financially, the additional healthcare costs incurred over the first two years following onset of NODM amount to US$21,500. Routine pre-transplant assessment of diabetic risk, with requisite modification of lifestyle, glycaemic monitoring and immunosuppressive regimens, coupled with standardized, aggressive hypoglycaemic management as necessary, offers an important opportunity to alleviate the burden of NODM for transplant patients.
Publisher: Springer Science and Business Media LLC
Date: 2006
DOI: 10.2165/00019053-200624010-00006
Abstract: Immunosuppressive therapy is required to prevent graft rejection. Calcineurin inhibitors (CNIs) such as tacrolimus are paradoxically toxic to the kidney, whereas sirolimus (rapamycin; Rapamune) is not generally associated with the nephrotoxicity of CNIs. The purpose of this study was to evaluate the relative cost utility of sirolimus versus tacrolimus for the primary prevention of graft rejection in renal transplant recipients in the UK. A stochastic simulation model was constructed using clinical trial and observational data comparing the two treatments, with a time horizon of up to 20 years. Costs were taken from a UK NHS perspective, valued at 2003 prices and discounted at 6%; benefits were discounted at 1.5%. Simulated events included patient and graft survival, haemodialysis, peritoneal dialysis, re-transplants and acute rejection. Costs were summed for events and the various maintenance therapies, and utility was differentially accredited depending upon survival and the renal replacement therapy used. Outcome was predicted using post-transplant creatinine levels up to 3 years. Extensive statistical, economic and sensitivity analyses were undertaken. Over the 10-year horizon, sirolimus gained 0.72 years (discounted) of functioning graft over tacrolimus, resulting in an incremental cost per year of functioning graft that was dominant. Over a 20-year time horizon, the cost effectiveness of sirolimus over tacrolimus improved further, with an average discounted gain of 1.8 years of functioning graft, resulting in an incremental cost-utility ratio that was also dominant. The number of haemodialysis events was 48,243 for sirolimus recipients versus 127,829 for those receiving tacrolimus, and peritoneal dialysis events numbered 40,872 versus 105,249, respectively. Similar values were obtained when real-life observational data on tacrolimus use in Cardiff, Wales were entered into the model; sirolimus remained dominant over tacrolimus under all scenarios. Our study suggests that sirolimus may be more cost effective than tacrolimus for the primary prevention of graft rejection in renal transplant recipients in the UK. Sirolimus was economically 'dominant' under almost all scenarios investigated, and this finding was robust to statistical economic analysis and univariate sensitivity analysis.
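The model above discounts costs and benefits at different rates (6% and 1.5%), a convention in UK economic evaluations of that period. A minimal sketch of differential discounting, using arbitrary illustrative figures rather than anything from the study:

```python
def discounted_total(values, rate):
    """Sum a yearly stream, discounting year t by 1/(1+rate)**t (year 0 undiscounted)."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(values))

# Hypothetical 20-year streams (figures invented for illustration):
annual_costs = [10_000.0] * 20   # cost per year, in GBP
annual_qalys = [0.8] * 20        # quality-adjusted life-years per year

total_cost = discounted_total(annual_costs, 0.06)   # costs discounted at 6%
total_qaly = discounted_total(annual_qalys, 0.015)  # benefits discounted at 1.5%
cost_per_qaly = total_cost / total_qaly
```

Because benefits are discounted more gently than costs, later years of graft function retain more weight in the utility total than in the cost total, which is part of why extending the horizon from 10 to 20 years strengthened sirolimus's result.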
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 15-03-2014
DOI: 10.1097/01.TP.0000438202.11971.2E
Abstract: Metabolic syndrome (MS) diagnosed early after kidney transplantation is a risk factor for developing new-onset diabetes. The aim of this study was to examine whether glucose intolerance and MS identified late after transplantation influence the progression of glycemic abnormalities in kidney transplant recipients. This is a retrospective study in which 76 non-diabetic renal transplant recipients underwent oral glucose tolerance tests (OGTT) in 2005 to 2006 (baseline) and then in 2011 to 2012 (follow-up). MS was identified using the International Diabetes Federation criteria and OGTT was interpreted according to the WHO classification. At follow-up, median time from transplantation was 11.1 years (range 6.2-23.8). Mean 0-hour and 2-hour plasma glucose levels were significantly higher at follow-up compared to baseline (5.7 ± 0.7 vs. 5.9 ± 0.9 mmol/L, P=0.03 and 6.7 ± 1.9 vs. 7.5 ± 2.8 mmol/L, P=0.03, respectively). The proportion of patients with an abnormal OGTT increased from 42% at baseline to 61% at follow-up (P=0.007). Patients with MS were more likely to progress to a higher degree of glucose intolerance compared to those without MS (58% vs. 27%, P=0.01). On multivariable logistic regression adjusted for age and gender, MS was significantly associated with the progression of glucose intolerance (OR 3.5, CI 1.2-9.9, P=0.01), as was a fasting glucose greater than 5.6 mmol/L (OR 4.8, CI 1.6-14.8, P=0.006). MS is a risk factor for the progression of glucose intolerance in renal transplant recipients in the late posttransplant period. Therefore, MS has to be considered in tandem with OGTT results to assess cardiovascular risk.
Location: United Kingdom of Great Britain and Northern Ireland
Location: United States of America