ORCID Profile
0000-0002-4948-8831
Current Organisation
National Institute for Health and Care Excellence
Publisher: BMJ
Date: 12-2017
DOI: 10.1136/BMJOPEN-2017-019382
Abstract: To investigate patterns of early repeat prescriptions and treatment switching over an 11-year period to estimate differences in the cost of medication wastage, dispensing fees and prescriber time for short (<60 days) and long (≥60 days) prescription lengths from the perspective of the National Health Service in the UK. Retrospective, multiple cohort study of primary care prescriptions from the Clinical Practice Research Datalink. Five random samples of 50 000 patients each prescribed oral drugs for (1) glucose control in type 2 diabetes mellitus (T2DM), (2) hypertension in T2DM, (3) statins (lipid management) in T2DM, (4) secondary prevention of myocardial infarction and (5) depression. The volume of medication wastage from early repeat prescriptions and three other types of treatment switches was quantified and costed. Dispensing fees and prescriber time were also determined. Total unnecessary costs (TUC; the cost of medication wastage, dispensing fees and prescriber time) associated with <60-day and ≥60-day prescriptions, standardised to a 120-day period, were then compared. Longer prescription lengths were associated with more medication waste per prescription. However, when dispensing fees and prescriber time were included, longer prescription lengths resulted in lower TUC. This finding was consistent across all five cohorts. Savings ranged from £8.38 to £12.06 per prescription per 120 days if a single long prescription was issued instead of multiple short prescriptions. Prescriber time costs accounted for the largest component of TUC. Shorter prescription lengths could potentially reduce medication wastage, but they may also increase dispensing fees and/or the time burden of issuing prescriptions.
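The TUC comparison above is per-period arithmetic: waste, fees and prescriber time are summed over however many prescriptions are needed to cover 120 days. A minimal sketch of that standardisation, with invented figures (not the study's data):

```python
# Hypothetical illustration of the abstract's 120-day standardisation:
# total unnecessary cost (TUC) = medication wastage + dispensing fee
# + prescriber time, summed over the prescriptions needed to cover
# 120 days. All figures below are invented for the arithmetic and are
# NOT taken from the study.
def tuc_per_120_days(rx_length_days, wastage_per_rx, fee_per_rx, prescriber_cost_per_rx):
    n_rx = 120 / rx_length_days  # prescriptions needed to cover 120 days
    return n_rx * (wastage_per_rx + fee_per_rx + prescriber_cost_per_rx)

# A long prescription wastes more per script, but is issued far less often.
short_rx = tuc_per_120_days(28, wastage_per_rx=0.50, fee_per_rx=0.90, prescriber_cost_per_rx=2.00)
long_rx = tuc_per_120_days(120, wastage_per_rx=1.80, fee_per_rx=0.90, prescriber_cost_per_rx=2.00)
print(round(short_rx - long_rx, 2))  # with these figures the short regimen costs more per 120 days
```

With these made-up inputs the fixed per-prescription costs (fee and prescriber time) dominate, which is the mechanism the abstract reports.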
Publisher: Elsevier BV
Date: 02-2013
Publisher: National Institute for Health and Care Research
Date: 05-2023
DOI: 10.3310/MNJY9014
Abstract: Bleeding among populations undergoing percutaneous coronary intervention or coronary artery bypass grafting and among conservatively managed patients with acute coronary syndrome exposed to different dual antiplatelet therapy and triple therapy (i.e. dual antiplatelet therapy plus an anticoagulant) has not been previously quantified. The objectives were to estimate hazard ratios for bleeding for different antiplatelet and triple therapy regimens, estimate resources and the associated costs of treating bleeding events, and to extend existing economic models of the cost-effectiveness of dual antiplatelet therapy. The study was designed as three retrospective population-based cohort studies emulating target randomised controlled trials. The study was set in primary and secondary care in England from 2010 to 2017. Participants were patients aged ≥ 18 years undergoing coronary artery bypass grafting or emergency percutaneous coronary intervention (for acute coronary syndrome), or conservatively managed patients with acute coronary syndrome. Data were sourced from linked Clinical Practice Research Datalink and Hospital Episode Statistics. Coronary artery bypass grafting and conservatively managed acute coronary syndrome: aspirin (reference) compared with aspirin and clopidogrel. Percutaneous coronary intervention: aspirin and clopidogrel (reference) compared with aspirin and prasugrel (ST elevation myocardial infarction only) or aspirin and ticagrelor. Primary outcome: any bleeding events up to 12 months after the index event. Secondary outcomes: major or minor bleeding, all-cause and cardiovascular mortality, mortality from bleeding, myocardial infarction, stroke, additional coronary intervention and major adverse cardiovascular events. 
The incidence of any bleeding was 5% among coronary artery bypass graft patients, 10% among conservatively managed acute coronary syndrome patients and 9% among emergency percutaneous coronary intervention patients, compared with 18% among patients prescribed triple therapy. Among coronary artery bypass grafting and conservatively managed acute coronary syndrome patients, dual antiplatelet therapy, compared with aspirin, increased the hazards of any bleeding (coronary artery bypass grafting: hazard ratio 1.43, 95% confidence interval 1.21 to 1.69; conservatively managed acute coronary syndrome: hazard ratio 1.72, 95% confidence interval 1.15 to 2.57) and major adverse cardiovascular events (coronary artery bypass grafting: hazard ratio 2.06, 95% confidence interval 1.23 to 3.46; conservatively managed acute coronary syndrome: hazard ratio 1.57, 95% confidence interval 1.38 to 1.78). Among emergency percutaneous coronary intervention patients, dual antiplatelet therapy with ticagrelor, compared with dual antiplatelet therapy with clopidogrel, increased the hazard of any bleeding (hazard ratio 1.47, 95% confidence interval 1.19 to 1.82), but did not reduce the incidence of major adverse cardiovascular events (hazard ratio 1.06, 95% confidence interval 0.89 to 1.27). Among ST elevation myocardial infarction percutaneous coronary intervention patients, dual antiplatelet therapy with prasugrel, compared with dual antiplatelet therapy with clopidogrel, increased the hazard of any bleeding (hazard ratio 1.48, 95% confidence interval 1.02 to 2.12), but did not reduce the incidence of major adverse cardiovascular events (hazard ratio 1.10, 95% confidence interval 0.80 to 1.51).
Health-care costs in the first year did not differ between dual antiplatelet therapy with clopidogrel and aspirin monotherapy among either coronary artery bypass grafting patients (mean difference £94, 95% confidence interval –£155 to £763) or conservatively managed acute coronary syndrome patients (mean difference £610, 95% confidence interval –£626 to £1516), but costs among emergency percutaneous coronary intervention patients were higher for those receiving dual antiplatelet therapy with ticagrelor than for those receiving dual antiplatelet therapy with clopidogrel, although only for patients on concurrent proton pump inhibitors (mean difference £1145, 95% confidence interval £269 to £2195). This study suggests that more potent dual antiplatelet therapy may increase the risk of bleeding without reducing the incidence of major adverse cardiovascular events. These results should be carefully considered by clinicians and decision-makers alongside randomised controlled trial evidence when making recommendations about dual antiplatelet therapy. The estimates for bleeding and major adverse cardiovascular events may be biased by unmeasured confounding and the exclusion of an eligible subgroup of patients who could not be assigned an intervention. Because of these limitations, a formal cost-effectiveness analysis could not be conducted. Future work should explore the feasibility of using other UK data sets of routinely collected data, less susceptible to bias, to estimate the benefit and harm of antiplatelet interventions. This trial is registered as ISRCTN76607611. This project was funded by the National Institute for Health and Care Research (NIHR) Health Technology Assessment programme and will be published in full in Health Technology Assessment Vol. 27, No. 8. See the NIHR Journals Library website for further project information.
Publisher: BMJ
Date: 07-2019
DOI: 10.1136/BMJOPEN-2018-025700
Abstract: To identify the key drivers of cost-effectiveness for cardiovascular magnetic resonance (CMR) when patients activate the primary percutaneous coronary intervention (PPCI) pathway. Economic decision models for two patient subgroups populated from secondary sources, each with a 1 year time horizon from the perspective of the National Health Service (NHS) and personal social services in the UK. Usual care (with or without CMR) in the NHS. Patients who activated the PPCI pathway, and for Model 1: underwent an emergency coronary angiogram and PPCI, and were found to have multivessel coronary artery disease. For Model 2: underwent an emergency coronary angiogram and were found to have unobstructed coronary arteries. Model 1 (multivessel disease) compared two different ischaemia testing methods, CMR or fractional flow reserve (FFR), versus stress echocardiography. Model 2 (unobstructed arteries) compared CMR with standard echocardiography versus standard echocardiography alone. Key drivers of cost-effectiveness for CMR, incremental costs and quality-adjusted life years (QALYs) and incremental cost-effectiveness ratios. In both models, the incremental costs and QALYs between CMR (or FFR, Model 1) versus no CMR (stress echocardiography, Model 1 and standard echocardiography, Model 2) were small (CMR: −£64 (95% CI −£232 to £187)/FFR: £360 (95% CI −£116 to £844) and CMR/FFR: 0.0012 QALYs (95% CI −0.0076 to 0.0093)) and (£98 (95% CI −£199 to £488) and 0.0005 QALYs (95% CI −0.0050 to 0.0077)), respectively. The diagnostic accuracy of the tests was the key driver of cost-effectiveness for both patient groups. If CMR were introduced for all subgroups of patients who activate the PPCI pathway, it is likely that diagnostic accuracy would be a key determinant of its cost-effectiveness. 
Further research is needed to definitively answer whether revascularisation guided by CMR or FFR leads to different clinical outcomes in acute coronary syndrome patients with multivessel disease.
Publisher: Elsevier BV
Date: 05-2017
DOI: 10.1016/J.LUNGCAN.2016.05.024
Abstract: To identify parameters that drive the cost-effectiveness of precision medicine by comparing the use of multiplex targeted sequencing (MTS) to select targeted therapy based on tumour genomic profiles to either no further testing with chemotherapy or no further testing with best supportive care in the fourth-line treatment of metastatic lung adenocarcinoma. A combined decision tree and Markov model to compare costs, life-years, and quality-adjusted life-years over a ten-year time horizon from an Australian healthcare payer perspective. Data sources included the published literature and a population-based molecular cohort study (Cancer 2015). Uncertainty was assessed using deterministic sensitivity analyses and quantified by estimating the expected value of perfect and partial perfect information. Uncertainty due to technological/scientific advancement was assessed through a number of plausible future scenario analyses. Point-estimate incremental cost-effectiveness ratios indicate that MTS is not cost-effective for selecting fourth-line treatment of metastatic lung adenocarcinoma. Lower mortality rates during testing and for true positive patients, lower health state utility values for progressive disease, and targeted therapy resulting in reductions in inpatient visits, however, all resulted in more favourable cost-effectiveness estimates for MTS. The expected value to decision makers of removing all current decision uncertainty was estimated to be between AUD 5,962,843 and AUD 13,196,451, indicating that additional research to reduce uncertainty may be a worthwhile investment. Plausible future scenario analyses revealed limited improvements in cost-effectiveness under scenarios of improved test performance, decreased costs of testing/interpretation, and no biopsy costs/adverse events. Reductions in off-label targeted therapy costs, when considered together with the other scenarios, did, however, indicate more favourable cost-effectiveness of MTS.
As more clinical evidence is generated for MTS, the model developed should be revisited and cost-effectiveness re-estimated under different testing scenarios to further understand the value of precision medicine and its potential impact on the overall health budget.
Publisher: Springer Science and Business Media LLC
Date: 25-09-2015
DOI: 10.1007/S10198-014-0633-1
Abstract: EQ-5D-3L scoring algorithms vary amongst countries, not only in the values of regression coefficients but also in the independent variables included in the regression model (hereafter referred to as model specification). It is unclear how much of this variation is due to differences in health state selection, the relative frequencies with which health states were valued, and model diagnostics, rather than to genuine differences in population preferences. Using aggregate data from a recent review, we noted all model specifications that were used. For each country the country's own model was re-fitted, as were all other model specifications. This was done twice: once using all valued health states for each country, and again using a common set of 17 health states for all countries. Goodness of fit was assessed using the following model diagnostics: mean absolute error (MAE), mean squared error (MSE) and rho (the Pearson correlation coefficient between predicted and observed mean utilities), both with and without leave-one-out cross-validation. Thirteen countries contributed data. Even when using a common set of health states, the preferred model varied across countries. However, choice of health states did impact the preferred model specification: when using cross-validation, the preferred specification changed in five of ten countries when moving from 17 health states to all valued health states. The relative frequency with which health states were valued had little impact on the preferred model. Variation in choices of health states to value is responsible for some, but not all, of the observed heterogeneity in model specification. Relative frequency of health state valuation and choice of model diagnostic have a limited impact on model preference; however, use of cross-validation has a substantial impact. The use of cross-validation, implemented through omitting health states rather than respondents, is recommended as one approach to assessing model fit.
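The recommended procedure (leave one health state out, refit, predict the omitted state's mean utility) can be sketched as below. The "model" here is a deliberately trivial stand-in, since real value sets regress utilities on EQ-5D dimension levels:

```python
# Sketch of the recommended cross-validation: omit health states (not
# respondents), refit, and predict each omitted state's mean utility.
# The 'model' is a toy stand-in: the mean of the training utilities.
def loo_mae(states, fit, predict):
    """Mean absolute error with each health state left out in turn."""
    errors = []
    for i, (code, utility) in enumerate(states):
        train = states[:i] + states[i + 1:]      # all states except this one
        model = fit(train)
        errors.append(abs(predict(model, code) - utility))
    return sum(errors) / len(errors)

fit = lambda train: sum(u for _, u in train) / len(train)  # toy: mean utility
predict = lambda model, code: model                        # toy: ignores the state code

# Three hypothetical EQ-5D-3L states with illustrative mean utilities.
states = [("11111", 1.0), ("21232", 0.5), ("33333", 0.1)]
print(loo_mae(states, fit, predict))
```

Swapping `fit`/`predict` for an actual regression on dimension-level dummies gives the MAE-under-cross-validation diagnostic the abstract compares across countries.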
Publisher: National Institute for Health and Care Research
Date: 05-2019
DOI: 10.3310/HSDR07170
Abstract: The increasing difficulty experienced by general practices in meeting patient demand is leading to new approaches being tried, including greater use of telephone consulting. To evaluate a ‘telephone first’ approach, in which all patients requesting a general practitioner (GP) appointment are asked to speak to a GP on the telephone first. The study used a controlled before-and-after (time-series) approach using national reference data sets; it also incorporated economic and qualitative elements. There was a comparison between 146 practices using the ‘telephone first’ approach and control practices in England with regard to GP Patient Survey scores and secondary care utilisation (Hospital Episode Statistics). A practice manager survey was used in the ‘telephone first’ practices. There was an analysis of practice data and the patient surveys conducted in 20 practices using the ‘telephone first’ approach. Interviews were conducted with 43 patients and 49 primary care staff. The study also included an analysis of costs. Following the introduction of the ‘telephone first’ approach, the average number of face-to-face consultations in practices decreased by 38% [95% confidence interval (CI) 29% to 45%; p < 0.0001], whereas there was a 12-fold increase in telephone consultations (95% CI 6.3-fold to 22.9-fold; p < 0.0001). The average durations of consultations decreased, which, when combined with the increased number of consultations, we estimate led to an overall increase of 8% in the mean time spent consulting by GPs, although there was a large amount of uncertainty (95% CI –1% to 17%; p = 0.0883). These average workload figures mask wide variation between practices, with some practices experiencing a substantial reduction in workload.
Comparing ‘telephone first’ practices with control practices in England in terms of scores in the national GP Patient Survey, there was an improvement of 20 percentage points in responses to the survey question on length of time to get to see or speak to a doctor or nurse. Other responses were slightly negative. The introduction of the ‘telephone first’ approach was followed by a small (2%) increase in hospital admissions; there was no initial change in accident and emergency (A&E) department attendance, but there was a subsequent small (2%) decrease in the rate of increase in A&E attendances. We found no evidence that the ‘telephone first’ approach would produce net reductions in secondary care costs. Patients and staff expressed a wide range of both positive and negative views in interviews. The ‘telephone first’ approach shows that many problems in general practice can be dealt with on the telephone. However, the approach does not suit all patients and is not a panacea for meeting demand for care, and it is unlikely to reduce secondary care costs. Future research could include identifying how telephone consulting best meets the needs of different patient groups and practices in varying circumstances, and how resources can be tailored to predictable patterns of demand. We acknowledge a number of limitations to our approach. We did not conduct a systematic review of the literature; data collected from clinical administrative records were not originally designed for research purposes; and for one element of the study we had no control data. In the economic analysis, we relied on practice managers’ perceptions of staff changes attributed to the ‘telephone first’ approach. In our qualitative work and patient survey, we have some evidence that the practices that participated in that element of the study had a more positive patient experience than those that did not. This study was funded by the National Institute for Health Research Health Services and Delivery Research programme.
Publisher: BMJ
Date: 06-2019
DOI: 10.1136/BMJOPEN-2019-029388
Abstract: ‘Real world’ bleeding in patients exposed to different regimens of dual antiplatelet therapy (DAPT) and triple therapy (TT, DAPT plus an anticoagulant) has a clinical and economic impact but has not been previously quantified. We will use linked Clinical Practice Research Datalink (CPRD) and Hospital Episode Statistics (HES) data to assemble populations eligible for three ‘target trials’ in the following patient groups: percutaneous coronary intervention (PCI); coronary artery bypass grafting (CABG); and conservatively managed (medication only) acute coronary syndrome (ACS). Patients ≥18 years old will be eligible if, in CPRD records, they have: ≥1 year of data before the index event; no prescription for DAPT or anticoagulants in the preceding 3 months; and a prescription for aspirin or DAPT within 2 months after discharge from the index event. The primary outcome will be any bleeding event (CPRD or HES) up to 12 months after the index event. We will estimate adjusted HRs for time to first bleeding event comparing: aspirin and clopidogrel (reference) versus aspirin and prasugrel or aspirin and ticagrelor after PCI; and aspirin (reference) versus aspirin and clopidogrel after CABG and ACS. We will describe rates of bleeding in patients prescribed TT (DAPT plus an anticoagulant). Potential confounders will be identified systematically using literature review, semistructured interviews with clinicians and a short survey of clinicians. We will conduct sensitivity analyses addressing the robustness of results to the study’s main limitation—that we will not be able to identify the intervention group for patients whose bleeding event occurs before a DAPT prescription in CPRD. This protocol was approved by the Independent Scientific Advisory Committee for the UK Medicines and Healthcare Products Regulatory Agency Database Research (protocol 16_126R) and the South West Cornwall and Plymouth Research Ethics Committee (17/SW/0092).
The findings will be presented in peer-reviewed journals, lay summaries and briefing papers to commissioners/other stakeholders. ISRCTN76607611; Pre-results.
Publisher: Informa UK Limited
Date: 21-08-2015
DOI: 10.1586/14737159.2014.929499
Abstract: The successful use of a targeted therapy is intrinsically linked to the ability of a companion diagnostic to correctly identify patients most likely to benefit from treatment. The aim of this study was to review the characteristics of companion diagnostics that are of importance for inclusion in an economic evaluation. Approaches for including these characteristics in model-based economic evaluations are compared with the intent to describe best practice methods. Five databases and government agency websites were searched to identify model-based economic evaluations comparing a companion diagnostic and subsequent treatment strategy to another alternative treatment strategy with model parameters for the sensitivity and specificity of the companion diagnostic (primary synthesis). Economic evaluations that limited model parameters for the companion diagnostic to only its cost were also identified (secondary synthesis). Quality was assessed using the Quality of Health Economic Studies instrument. 30 studies were included in the review (primary synthesis, n = 12; secondary synthesis, n = 18). Incremental cost-effectiveness ratios may be lower when the only parameter for the companion diagnostic included in a model is the cost of testing. Incorporating the test's accuracy in addition to its cost may be a more appropriate methodological approach. Altering the prevalence of the genetic biomarker, specific population tested, type of test, test accuracy and timing/sequence of multiple tests can all impact overall model results. The impact of altering a test's threshold for positivity is unknown as it was not addressed in any of the included studies. Additional quality criteria as outlined in our methodological checklist should be considered due to the shortcomings of standard quality assessment tools in differentiating studies that incorporate important test-related characteristics and those that do not.
There is a need to refine methods for incorporating the characteristics of companion diagnostics into model-based economic evaluations to ensure consistent and transparent reimbursement decisions are made.
Publisher: Springer Science and Business Media LLC
Date: 07-11-2016
DOI: 10.1007/S40273-015-0343-2
Abstract: There is a growing appetite for large complex databases that integrate a range of personal, socio-demographic, health, genetic and financial information on individuals. It has been argued that 'Big Data' will provide the necessary catalyst to advance both biomedical research and health economics and outcomes research. However, it is important that we do not succumb to being data rich but information poor. This paper discusses the benefits and challenges of building Big Data, analysing Big Data and making appropriate inferences in order to advance cancer care, using Cancer 2015 (a prospective, longitudinal, genomic cohort study in Victoria, Australia) as a case study. Cancer 2015 has been linked to State and Commonwealth reimbursement databases that have known limitations. This partly reflects the funding arrangements in Australia, a country with both public and private provision, including public funding of private healthcare, and partly the legislative frameworks that govern data linkage. Additionally, linkage is not without time delays and, as such, achieving a contemporaneous database is challenging. Despite these limitations, there is clear value in using linked data and creating Big Data. This paper describes the linked Cancer 2015 dataset, discusses estimation issues given the nature of the data and presents panel regression results that allow us to make possible inferences regarding which patient, disease, genomic and treatment characteristics explain variation in health expenditure.
Publisher: Springer Science and Business Media LLC
Date: 21-11-2017
DOI: 10.1038/S41525-017-0037-0
Abstract: The clinical translation of genomic sequencing is hindered by the limited information available to guide investment into those areas where genomics is well placed to deliver improved health and economic outcomes. To date, genomic medicine has achieved its greatest successes through applications to diseases that have a high genotype–phenotype correlation and high penetrance, with a near certainty that the individual will develop the condition in the presence of the genotype. It has been anticipated that genomics will play an important role in promoting population health by targeting at-risk individuals and reducing the incidence of highly prevalent, costly, complex diseases, with potential applications across screening, prevention, and treatment decisions. However, where primary or secondary prevention requires behavioural changes, there is currently very little evidence to support reduction in disease incidence. A better understanding of the relationship between genomic variation and complex diseases will be necessary before effective genomic risk identification and management of the risk of complex diseases in healthy individuals can be carried out in clinical practice. Our recommended approach is that priority for genomic testing should focus on diseases where there is strong genotype–phenotype correlation, high or certain penetrance, the effects of the disease are serious and near-term, there is the potential for prevention and/or treatment, and the net costs incurred are acceptable for the health gains achieved.
Publisher: Elsevier BV
Date: 09-2018
DOI: 10.1016/J.JVAL.2018.06.016
Abstract: Next-generation sequencing (NGS) is considered to be a prominent example of "big data" because of the quantity and complexity of data it produces and because it presents an opportunity to use powerful information sources that could reduce clinical and health economic uncertainty at a patient level. One obstacle to translating NGS into routine health care has been a lack of clinical trials evaluating NGS technologies, which could be used to populate cost-effectiveness analyses (CEAs). A key question is whether big data can be used to partially support CEAs of NGS. This question has been brought into sharp focus with the creation of large national sequencing initiatives. In this article we summarize the main methodological and practical challenges of using big data as an input into CEAs of NGS. Our focus is on the challenges of using large observational datasets and cohort studies and linking these data to the genomic information obtained from NGS, as is being pursued in the conduct of large genomic sequencing initiatives. We propose potential solutions to these key challenges. We conclude that the use of genomic big data to support and inform CEAs of NGS technologies holds great promise. Nevertheless, health economists face substantial challenges when using these data and must be cognizant of them before big data can be confidently used to produce evidence on the cost-effectiveness of NGS.
Publisher: National Institute for Health and Care Research
Date: 12-2017
DOI: 10.3310/HTA21780
Abstract: To reduce expenditure on, and wastage of, drugs, some commissioners have encouraged general practitioners to issue shorter prescriptions, typically 28 days in length; however, the evidence base for this recommendation is uncertain. To evaluate the evidence of the clinical effectiveness and cost-effectiveness of shorter versus longer prescriptions for people with stable chronic conditions treated in primary care. The design of the study comprised three elements. First, a systematic review comparing 28-day prescriptions with longer prescriptions in patients with chronic conditions treated in primary care, evaluating any relevant clinical outcomes, adherence to treatment, costs and cost-effectiveness. Databases searched included MEDLINE (PubMed), EMBASE, Cumulative Index to Nursing and Allied Health Literature, Web of Science and Cochrane Central Register of Controlled Trials. Searches were from database inception to October 2015 (updated search to June 2016 in PubMed). Second, a cost analysis of medication wastage associated with <60-day and ≥60-day prescriptions for five patient cohorts over an 11-year period from the Clinical Practice Research Datalink. Third, a decision model adapting three existing models to predict costs and effects of differing adherence levels associated with 28-day versus 3-month prescriptions in three clinical scenarios. In the systematic review, from 15,257 unique citations, 54 full-text papers were reviewed and 16 studies were included, five of which were abstracts and one of which was an extended conference abstract. None was a randomised controlled trial: 11 were retrospective cohort studies, three were cross-sectional surveys and two were cost studies. No information on health outcomes was available. An exploratory meta-analysis based on six retrospective cohort studies suggested that lower adherence was associated with 28-day prescriptions (standardised mean difference –0.45, 95% confidence interval –0.65 to –0.26).
The cost analysis showed that a statistically significant increase in medication waste was associated with longer prescription lengths. However, when accounting for dispensing fees and prescriber time, longer prescriptions were found to be cost saving compared with shorter prescriptions. Prescriber time was the largest component of the calculated cost savings to the NHS. The decision modelling suggested that, in all three clinical scenarios, longer prescription lengths were associated with lower costs and higher quality-adjusted life-years. The available evidence was found to be at a moderate to serious risk of bias. All of the studies were conducted in the USA, which was a cause for concern in terms of generalisability to the UK. No evidence of the direct impact of prescription length on health outcomes was found. The cost study could investigate only the prescriptions issued; it could not assess patient adherence to those prescriptions. Additionally, the cost study was based only on the products issued and did not account for underlying patient diagnoses. A lack of good-quality evidence affected our decision modelling strategy. Although the quality of the evidence was poor, this study found that longer prescriptions may be less costly overall, and may be associated with better adherence than 28-day prescriptions in patients with chronic conditions being treated in primary care. There is a need to more reliably evaluate the impact of differing prescription lengths on adherence, on patient health outcomes and on total costs to the NHS. The priority should be to identify patients with particular conditions or characteristics who should receive shorter or longer prescriptions. To determine the need for any further research, an expected value of perfect information analysis should be performed. This study is registered as PROSPERO CRD42015027042. This study was funded by the National Institute for Health Research Health Technology Assessment programme.
Publisher: BMJ
Date: 04-2020
DOI: 10.1136/BMJOPEN-2019-035020
Abstract: People with type 2 diabetes (T2D) can improve glycaemic control or even achieve remission through weight loss and reduce their use of medication and risk of cardiovascular disease. The Glucose Lowering through Weight management (GLoW) trial will evaluate whether a tailored diabetes education and behavioural weight management programme (DEW) is more effective and cost-effective than a diabetes education (DE) programme in helping people with overweight or obesity and a recent diagnosis of T2D to lower their blood glucose, lose weight and improve other markers of cardiovascular risk. This study is a pragmatic, randomised, single-blind, parallel group, two-arm, superiority trial. We will recruit 576 adults with body mass index kg/m² and diagnosis of T2D in the past 3 years and randomise them to a tailored DEW or a DE programme. Participants will attend measurement appointments at a local general practitioner practice or research centre at baseline, 6 and 12 months. The primary outcome is 12-month change in glycated haemoglobin. The effect of the intervention on the primary outcome will be estimated and tested using a linear regression model (analysis of covariance) including randomisation group and adjusted for baseline value of the outcome and the randomisation stratifiers. Participants will be included in the group to which they were randomised, under the intention-to-treat principle. Secondary outcomes include 6-month and 12-month changes in body weight, body fat percentage, systolic and diastolic blood pressure and lipid profile; probability of achieving good glycaemic control; probability of achieving remission from diabetes; probability of losing 5% and 10% of body weight; and modelled cardiovascular risk (UKPDS). An intention-to-treat within-trial cost-effectiveness analysis will be conducted from NHS and societal perspectives using participant-level data.
Qualitative interviews will be conducted with participants to understand why and how the programme achieved its results and how participants manage their weight after the programme ends. Ethical approval was received from East of Scotland Research Ethics Service on 15 May 2018 (18/ES/0048). This protocol (V.3) was approved on 19 June 2019. Findings will be published in peer-reviewed scientific journals and communicated to other stakeholders as appropriate. ISRCTN18399564 .
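The primary analysis described above is an analysis of covariance: regress the 12-month outcome on the randomisation group while adjusting for the baseline value. A minimal sketch of that model, using ordinary least squares on entirely hypothetical HbA1c data (the numbers are illustrative, not trial data):

```python
import numpy as np

# Hypothetical data: baseline HbA1c (%), group indicator (1 = DEW, 0 = DE),
# and 12-month HbA1c for six participants
baseline = np.array([8.1, 7.9, 8.4, 8.0, 8.3, 7.8])
group = np.array([1.0, 0.0, 1.0, 0.0, 1.0, 0.0])
outcome = np.array([6.9, 7.8, 7.1, 7.9, 7.0, 7.7])

# ANCOVA design matrix: intercept, randomisation group, baseline value
X = np.column_stack([np.ones_like(baseline), group, baseline])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)

# Coefficient on the group column is the baseline-adjusted
# between-group difference at 12 months
treatment_effect = coef[1]
```

In a real trial analysis the model would also include the randomisation stratifiers as covariates, and inference would come from the regression standard errors rather than the point estimate alone.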
Publisher: Elsevier BV
Date: 07-2013
DOI: 10.1016/J.JTCVS.2012.06.018
Abstract: The primary analysis estimated the cost-effectiveness of transfemoral transcatheter aortic valve implantation (Edwards SAPIEN heart valve; Edwards Lifesciences LLC, Irvine, Calif) compared with standard management in inoperable patients with severe, symptomatic aortic stenosis. The secondary analysis estimated the cost-effectiveness of transcatheter aortic valve implantation (transfemoral or transapical approaches; SAPIEN heart valve) compared with surgical aortic valve replacement in operable patients with severe, symptomatic aortic stenosis. A combined decision tree and Markov model was developed to compare costs, life-years, and quality-adjusted life-years over a 20-year time horizon from the Canadian health-care payer perspective. The Placement of Aortic Transcatheter Valves trial provided rates of postoperative complications and mortality. Costs were derived from the Ontario Case Costing Initiative. Comprehensive sensitivity analyses were used to explore the impact of uncertainty on the cost-effectiveness estimates. In the primary analysis, comparing transfemoral transcatheter aortic valve implantation with standard management resulted in incremental cost-effectiveness ratios of $36,458/life-year and $51,324/quality-adjusted life-year. In the secondary analysis, transcatheter aortic valve implantation (transfemoral or transapical) and surgical aortic valve replacement were compared, resulting in an incremental cost-effectiveness ratio of $870,143/life-year and transcatheter aortic valve implantation being dominated by surgical aortic valve replacement when comparing quality-adjusted life-years. Deterministic sensitivity analysis for the primary analysis identified the procedural costs and 1-year mortality rates of both transfemoral transcatheter aortic valve implantation and standard management as the most sensitive parameters in the model, whereas results from the secondary analysis were largely unchanged.
Removal of long-term complications in both analyses led to more favorable incremental cost-effectiveness ratios for transcatheter aortic valve implantation. This economic evaluation suggested that transfemoral transcatheter aortic valve implantation was a cost-effective option compared with standard management for inoperable patients with severe, symptomatic aortic stenosis, but it might not be a cost-effective treatment compared with surgical aortic valve replacement for operable patients.
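The incremental cost-effectiveness ratios quoted in the abstract are simply the extra cost of one strategy over its comparator divided by the extra health gained. A minimal sketch of that calculation (the input figures are hypothetical, chosen only to land near the abstract's $/QALY range; they are not the study's model outputs):

```python
def icer(cost_new, cost_comparator, effect_new, effect_comparator):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of
    effect (e.g. dollars per quality-adjusted life-year gained)."""
    delta_cost = cost_new - cost_comparator
    delta_effect = effect_new - effect_comparator
    if delta_effect <= 0 and delta_cost >= 0:
        # More costly and no more effective: the new strategy is dominated
        raise ValueError("new strategy is dominated; ICER is not meaningful")
    return delta_cost / delta_effect

# Hypothetical: the new strategy costs $20,000 more and yields 0.39 extra QALYs
ratio = icer(cost_new=70_000, cost_comparator=50_000,
             effect_new=1.39, effect_comparator=1.00)
# ratio is compared against a willingness-to-pay threshold to judge value
```

The "dominated" branch mirrors the abstract's secondary analysis, where transcatheter implantation was both costlier and less effective in QALY terms than surgical replacement, so no meaningful ratio is reported.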
Publisher: MDPI AG
Date: 17-09-2019
DOI: 10.3390/NU11092236
Abstract: The objective of this trial was to test two promising front-of-pack nutrition labels, (1) the United Kingdom’s Multiple Traffic Lights (MTL) label and (2) France’s Nutri-Score (NS), relative to a no-label control. We hypothesized that both labels would improve diet quality but that NS would be more effective owing to its greater simplicity. We tested this hypothesis via an online grocery store using a 3 × 3 crossover (within-person) design with 154 participants. Outcomes, assessed via within-person regression models, include a modified Alternative Healthy Eating Index (AHEI)-2010 (primary), average Nutri-Score, calories purchased, and singular measures of diet quality of purchase orders. Results show that both labels significantly improve modified AHEI scores relative to Control, but neither is statistically superior on this measure. NS performed statistically better than MTL and Control based on average Nutri-Score; yet, unlike MTL, it did not statistically reduce calories or sugar from beverages. This suggests that NS may be preferred if the goal is to improve overall diet quality but, because calories are clearly displayed on the label, MTL may perform better if the goal is to reduce total energy intake.
Publisher: Elsevier BV
Date: 2016
Publisher: Elsevier BV
Date: 04-2020
Publisher: Future Medicine Ltd
Date: 2015
DOI: 10.2217/PME.14.81
Publisher: Springer Science and Business Media LLC
Date: 26-05-2017
Publisher: SAGE Publications
Date: 22-03-2014
Abstract: Background. There has been a growing interest around the world in developing country-specific scoring algorithms for the EQ-5D. This study systematically reviews all existing EQ-5D valuation studies to highlight their strengths and limitations, explores heterogeneity in observed utilities using meta-regression, and proposes a methodological checklist for reporting EQ-5D valuation studies. Methods. We searched Medline, EMBASE, the National Health Service Economic Evaluation Database (NHS EED) via Wiley’s Cochrane Library, and Wiley’s Health Economic Evaluation Database from inception through November 2012, as well as bibliographies of key papers and the EuroQol Plenary Meeting Proceedings from 1991 to 2012, for English-language reports of EQ-5D valuation studies. Two reviewers independently screened the titles and abstracts for relevance. Three reviewers performed data extraction and compared the characteristics and scoring algorithms developed in the included valuation studies. Results. Of the 31 studies included in the review, 19 used the time trade-off (TTO) technique, 10 used the visual analogue scale (VAS) technique, and 2 used both TTO and VAS. Most studies included respondents from the general population selected by random or quota sampling and used face-to-face interviews or postal surveys. Studies valued between 7 and 198 total states, with 1–23 states valued per respondent. Different model specifications have been proposed for scoring. Some sample or demographic factors, including gender, education, percentage urban population, and national health care expenditure, were associated with differences in observed utilities for moderate or severe health states. Conclusions. EQ-5D valuation studies conducted to date have varied widely in their design and in the resulting scoring algorithms. Therefore, we propose the Checklist for Reporting Valuation Studies of the EQ-5D (CREATE) for those conducting valuation studies.
Publisher: Elsevier BV
Date: 2012
DOI: 10.1016/J.IJROBP.2010.08.060
Abstract: To systematically review the effectiveness and safety of 5-hydroxytryptamine-3 receptor antagonists (5-HT3 RAs) compared with other antiemetic medication or placebo for prophylaxis of radiation-induced nausea and vomiting. We searched the following electronic databases: MEDLINE, Embase, the Cochrane Central Register of Controlled Clinical Trials, and Web of Science. We also hand-searched reference lists of included studies. Randomized, controlled trials that compared a 5-HT3 RA with another antiemetic medication or placebo for preventing radiation-induced nausea and vomiting were included. We excluded studies recruiting patients receiving concomitant chemotherapy. When appropriate, meta-analysis was conducted using Review Manager (v5) software. Relative risks were calculated using inverse variance as the statistical method under a random-effects model. We assessed the quality of evidence by outcome using the Grading of Recommendations Assessment, Development, and Evaluation approach. Eligibility screening of 47 articles resulted in 9 included in the review. The overall methodologic quality was moderate. Meta-analysis of 5-HT3 RAs vs. placebo showed significant benefit for 5-HT3 RAs (relative risk [RR] 0.70, 95% confidence interval [CI] 0.57-0.86 for emesis; RR 0.84, 95% CI 0.73-0.96 for nausea). Meta-analysis comparing 5-HT3 RAs vs. metoclopramide showed a significant benefit of the 5-HT3 RAs for emetic control (RR 0.27, 95% CI 0.15-0.47). 5-Hydroxytryptamine-3 RAs are superior to placebo and other antiemetics for prevention of emesis, but little benefit was identified for nausea prevention. 5-Hydroxytryptamine-3 RAs are suggested for prevention of emesis. Limited evidence was found regarding delayed emesis, adverse events, quality of life, or need for rescue medication.
Future randomized, controlled trials should evaluate different 5-HT3 antiemetics and new agents with novel mechanisms of action, such as the NK1 receptor antagonists, to determine the most effective drug. Delayed nausea and vomiting should be a focus of future study, perhaps concentrating on the palliative cancer population.
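The pooling method named in the abstract, inverse-variance weighting under a random-effects model, is commonly implemented with the DerSimonian-Laird estimator of between-study variance. A minimal sketch on the log relative-risk scale (purely illustrative; the study used Review Manager, not this code):

```python
import math

def pool_random_effects(log_rrs, variances):
    """DerSimonian-Laird random-effects pooling of log relative risks.

    log_rrs: per-study log relative risks; variances: their sampling variances.
    Returns (pooled RR, 95% CI as a (lower, upper) tuple).
    """
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))  # heterogeneity
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_rrs) - 1)) / c)          # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]         # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_rrs)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return math.exp(pooled), (math.exp(pooled - 1.96 * se),
                              math.exp(pooled + 1.96 * se))
```

With no between-study heterogeneity (identical study estimates) the estimator collapses to the fixed-effect inverse-variance result, which is a useful sanity check.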
Publisher: Elsevier BV
Date: 07-2018
Publisher: Springer Science and Business Media LLC
Date: 27-10-2018
DOI: 10.1007/S11695-018-3553-9
Abstract: There is a growing interest in comparing the effectiveness and costs of alternative forms of bariatric surgery. We aimed to examine the per-patient, procedural costs of Roux-en-Y gastric bypass (RYGB), sleeve gastrectomy (SG) and adjustable gastric banding (AGB) in the United Kingdom. Multi-centre (two National Health Service (NHS) and one private hospital) micro-costing, using a time-and-motion study. Prospective collection of surgery times, staff quantities, equipment, instruments and consumables for 12 patients (four RYGB, five SG, three AGB) from patients' first surgeon interaction on the day of surgery to departure from the theatre recovery area. Costs were attached to quantities and mean costs compared. Sensitivity and scenario analyses assessed the impact on total costs of varying surgery inputs and of considering additional plausible factors, respectively. Mean procedural costs were £5002 for RYGB, £4306 for SG and £2527 for AGB. Varying staff seniority or altering procedure times had small impacts on costs (± 4-6%). Reducing prices of consumables by 20% reduced costs by 10-13%. Accounting for differences in surgical technique by altering the number of staple reloads used impacted costs by ± 7-10%. Adjusted total costs from scenario analyses were similar to NHS tariffs for RYGB and SG (differences of £51 and -£119 respectively) but were much lower for AGB (difference of £1982). These detailed costs will allow for more precise reimbursement of bariatric surgery and support comprehensive assessments of cost-effectiveness. Additional work to investigate the costs of post-surgical care, re-operations and life-long support received by patients following surgery is required.
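Micro-costing, as used in the bariatric surgery study above, attaches unit costs to each observed quantity and sums them. A minimal sketch of that bottom-up aggregation (the staff rates and consumable prices below are illustrative assumptions, not the study's data):

```python
def procedure_cost(staff, consumables):
    """Time-and-motion micro-costing for one procedure.

    staff: list of (minutes_observed, cost_rate_per_minute) per staff member
    consumables: list of unit costs for items used
    Returns the total procedural cost.
    """
    staff_cost = sum(minutes * rate for minutes, rate in staff)
    return staff_cost + sum(consumables)

# Illustrative only: a surgeon and a nurse observed for 90 minutes each,
# plus two consumable items
total = procedure_cost(staff=[(90, 2.0), (90, 0.8)],
                       consumables=[120.0, 310.0])
```

In practice each cost component would be varied in sensitivity analyses, as the study does with staff seniority, procedure times and consumable prices.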
Publisher: Springer Science and Business Media LLC
Date: 20-09-2018
Publisher: BMJ
Date: 24-01-2020
DOI: 10.1136/JMEDGENET-2019-106445
Abstract: This study provides an integrated assessment of the economic and social impacts of genomic sequencing for the detection of monogenic disorders resulting in intellectual disability (ID). Multiple knowledge bases were cross-referenced and analysed to compile a reference list of monogenic disorders associated with ID. Multiple literature searches were used to quantify the health and social costs for the care of people with ID. Health and social expenditures and the current cost of whole-exome sequencing and whole-genome sequencing were quantified in relation to the more common causes of ID and their impact on lifespan. On average, individuals with ID incur annual costs, in terms of health costs, disability support, lost income and other social costs, of US$172 000, accumulating to many millions of dollars over a lifetime. The diagnosis of monogenic disorders through genomic testing provides the opportunity to improve diagnosis and management, and to reduce the costs of ID through informed reproductive decisions, reductions in unproductive diagnostic tests and increasingly targeted therapies.
Publisher: Elsevier BV
Date: 2020
DOI: 10.1016/J.LUNGCAN.2019.11.022
Abstract: There is an expanding list of therapeutically relevant biomarkers for non-small cell lung cancer (NSCLC), and molecular profiling at diagnosis is paramount. Tissue attrition in scaling traditional single-biomarker assays from small biopsies is an increasingly encountered problem. We sought to compare the performance of targeted next-generation sequencing (NGS) panels with traditional assays and correlate the mutational landscape with PD-L1 status in Singaporean patients. We identified consecutive patients diagnosed between Jan 2016 and Sep 2017 with residual tissue after standard molecular testing. Tissue samples were tested using a targeted NGS panel for DNA alterations (29 selected genes including BRAF, EGFR, ERBB2 and TP53) and an RNA fusion panel (ALK, ROS1 and RET). PD-L1 immunohistochemistry was also performed. A cost-effectiveness analysis of NGS compared to standard molecular testing was conducted. A total of 174 samples were evaluated: PD-L1 (n = 169), NGS DNA panel (n = 173) and RNA fusion (n = 119) testing. Median age was 68 years, 53 % were male, 58 % were never smokers, 85 % were Chinese, 66 % had stage IV disease and 95 % had adenocarcinoma histology. In patients profiled with NGS on DNA, EGFR (56 %), KRAS (14 %), BRAF (2 %) and ERBB2 (1 %) mutations were found. RNA fusion testing revealed fusions in ALK (6 %), RET (3 %) and ROS1 (1 %). Cost-effectiveness analysis demonstrated that, compared to sequential testing in EGFR-negative patients, upfront NGS testing would identify an additional 1 % of patients with actionable alterations for targeted therapy, without significant increases in testing cost or turnaround time. This study demonstrates that even in an EGFR-mutant predominant population, upfront NGS represents a feasible, cost-effective method of diagnostic molecular profiling compared with sequential testing strategies.
Our results support the implementation of diagnostic NGS in non-squamous NSCLC in Asia to allow patients access to the most appropriate personalized therapy.
Publisher: SAGE Publications
Date: 18-09-2018
Abstract: Objectives. To assess the external validity of mapping algorithms for predicting EQ-5D-3L utility values from EORTC QLQ-C30 responses not previously validated, and to assess whether statistical models not previously applied are better suited for mapping the EORTC QLQ-C30 to the EQ-5D-3L. Methods. In total, 3866 observations for 1719 patients from a longitudinal study (Cancer 2015) were used to validate existing algorithms. Predictive accuracy was compared to previously validated algorithms using root mean squared error and mean absolute error across the EQ-5D-3L range and for 10 tumor-type-specific samples, as well as using differences between estimated quality-adjusted life years. Thirteen new algorithms were estimated using a subset of the Cancer 2015 data (3203 observations for 1419 patients), applying various linear, response mapping, beta, and mixture models. Validation was performed using 2 data sets composed of patients with varying disease severity not used in the estimation, and all available algorithms were ranked on their performance. Results. None of the 5 existing algorithms offer an improvement in predictive accuracy over preferred algorithms from previous validation studies. Of the newly estimated algorithms, a 2-part beta model performed the best across the validation criteria and in data sets composed of patients with different levels of disease severity. Validation results did, however, vary widely between the 2 data sets, and the most accurate algorithm appears to depend on health state severity as the distribution of observed EQ-5D-3L values varies. Linear models performed better for patients in relatively good health, whereas beta, mixture, and response mapping models performed better for patients in worse health. Conclusion. The most appropriate mapping algorithm to apply in practice may depend on the disease severity of the patient sample whose utility values are being predicted.
Publisher: Future Medicine Ltd
Date: 09-2013
DOI: 10.2217/PGS.13.142
Abstract: The development of genomic technologies has ushered in the era of pharmacogenomics. However, discoveries and clinical use of targeted therapies are still in their infancy. A focus on monogenic pharmacogenetic traits may contribute to this lack of progress. Variation in drug response is likely a complex paradigm involving not only genomic factors but proteomic, metabolomic and epigenomic influences. The incorporation of these omics elements into pharmaceutical development and clinical decision-making will ultimately require the use of methods to determine clinical and economic value. Current methodologies and guidelines for determining clinical effectiveness and cost–effectiveness may have limited applicability to the increasingly personalized nature of omics treatment strategies. Using examples from oncology, this article argues for the adaptation and tailoring of three existing methods for ensuring development and clinical use of multiomics-guided therapies that are effective, safe and offer value for money.
Location: United Kingdom of Great Britain and Northern Ireland
Location: United Kingdom of Great Britain and Northern Ireland
Location: United Kingdom of Great Britain and Northern Ireland
Start Date: 2017
End Date: 2021
Funder: Programme Grants for Applied Research
View Funded Activity
Start Date: 2014
End Date: 2014
Funder: Garvan Institute of Medical Research
View Funded Activity
Start Date: 2019
End Date: 2021
Funder: Singapore Millennium Foundation
View Funded Activity