ORCID Profile
0000-0002-2358-2440
Current Organisations
University of Cambridge, University of Oxford, Medical Journal of Australia, Open Access Australasia (previously Australasian Open Access Strategy Group)
Publisher: Springer Science and Business Media LLC
Date: 11-09-2017
Publisher: WHO Press
Date: 05-2006
Publisher: Public Library of Science (PLoS)
Date: 16-07-2020
Publisher: Medicinska Naklada d.o.o.
Date: 10-2023
DOI: 10.15836/CCAR2024.61
Publisher: Public Library of Science (PLoS)
Date: 24-09-2013
Publisher: Public Library of Science (PLoS)
Date: 19-06-2012
Publisher: Public Library of Science (PLoS)
Date: 19-10-2004
Publisher: Informa UK Limited
Date: 26-10-2023
Publisher: Public Library of Science (PLoS)
Date: 24-02-2009
Publisher: Informa UK Limited
Date: 27-10-2023
Publisher: Elsevier BV
Date: 06-2002
Publisher: Public Library of Science (PLoS)
Date: 26-07-2005
Publisher: Public Library of Science (PLoS)
Date: 15-07-2008
Publisher: Public Library of Science (PLoS)
Date: 27-12-2012
Publisher: Elsevier BV
Date: 10-1999
Publisher: Public Library of Science (PLoS)
Date: 29-09-2009
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 2012
DOI: 10.1016/J.IJSU.2011.10.001
Abstract: Overwhelming evidence shows the quality of reporting of randomised controlled trials (RCTs) is not optimal. Without transparent reporting, readers cannot judge the reliability and validity of trial findings nor extract information for systematic reviews. Recent methodological analyses indicate that inadequate reporting and design are associated with biased estimates of treatment effects. Such systematic error is seriously damaging to RCTs, which are considered the gold standard for evaluating interventions because of their ability to minimise or avoid bias. A group of scientists and editors developed the CONSORT (Consolidated Standards of Reporting Trials) statement to improve the quality of reporting of RCTs. It was first published in 1996 and updated in 2001. The statement consists of a checklist and flow diagram that authors can use for reporting an RCT. Many leading medical journals and major international editorial groups have endorsed the CONSORT statement. The statement facilitates critical appraisal and interpretation of RCTs. During the 2001 CONSORT revision, it became clear that explanation and elaboration of the principles underlying the CONSORT statement would help investigators and others to write or appraise trial reports. A CONSORT explanation and elaboration article was published in 2001 alongside the 2001 version of the CONSORT statement. After an expert meeting in January 2007, the CONSORT statement has been further revised and is published as the CONSORT 2010 Statement. This update improves the wording and clarity of the previous checklist and incorporates recommendations related to topics that have only recently received recognition, such as selective outcome reporting bias. This explanatory and elaboration document, intended to enhance the use, understanding, and dissemination of the CONSORT statement, has also been extensively revised. It presents the meaning and rationale for each new and updated checklist item, providing examples of good reporting and, where possible, references to relevant empirical studies. Several examples of flow diagrams are included. The CONSORT 2010 Statement, this revised explanatory and elaboration document, and the associated website (www.consort-statement.org) should be helpful resources to improve reporting of randomised trials.
Publisher: F1000 Research Ltd
Date: 06-11-2017
DOI: 10.12688/F1000RESEARCH.13060.1
Abstract: Academic publishing is evolving and our current system of correcting research post-publication is failing, both ideologically and practically. It does not encourage researchers to engage in necessary post-publication changes in a consistent way. Worse yet, post-publication ‘updates’ can be misconstrued as punishments or admissions of misconduct. We propose a different model that publishers of research can apply to the content they publish, ensuring that any post-publication amendments are seamless, transparent and propagated to all the countless places online where descriptions of research appear. At the center of our proposal is use of the neutral term “amendment” to describe all forms of post-publication change to an article. We lay out a straightforward and consistent process that applies to each of three types of amendment that differ only in the extent to which the study is amended: minor, major, and complete. This proposed system supports the dynamic nature of the research process itself as researchers continue to refine or extend the work, and removes the emotive climate particularly associated with retractions and corrections to published work. It allows researchers to cite and share the most up-to-date and complete versions of articles with certainty, and gives decision makers access to the most up-to-date information. Crucially, however, we do not underestimate the importance of investigations of potential misconduct. This proposal allows two interrelated processes - amendment of articles and investigation of misconduct - to be uncoupled temporally, allowing a more rapid correction of the literature at a journal while institutional investigations take place, without either having to follow the other's timeline.
Publisher: Wiley
Date: 04-06-2023
DOI: 10.5694/MJA2.51976
Publisher: Springer International Publishing
Date: 2018
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 2011
Publisher: Public Library of Science (PLoS)
Date: 31-07-2007
Publisher: BMJ
Date: 19-01-2009
DOI: 10.1136/BMJ.A3152
Publisher: Elsevier BV
Date: 2002
Publisher: Informa UK Limited
Date: 30-10-2023
Publisher: Public Library of Science (PLoS)
Date: 29-03-2011
Publisher: Springer Science and Business Media LLC
Date: 21-01-2016
Publisher: BMJ
Date: 07-03-2014
DOI: 10.1136/BMJ.G1687
Abstract: Without a complete published description of interventions, clinicians and patients cannot reliably implement interventions that are shown to be useful, and other researchers cannot replicate or build on research findings. The quality of description of interventions in publications, however, is remarkably poor. To improve the completeness of reporting, and ultimately the replicability, of interventions, an international group of experts and stakeholders developed the Template for Intervention Description and Replication (TIDieR) checklist and guide. The process involved a literature review for relevant checklists and research, a Delphi survey of an international panel of experts to guide item selection, and a face to face panel meeting. The resultant 12 item TIDieR checklist (brief name, why, what (materials), what (procedure), who provided, how, where, when and how much, tailoring, modifications, how well (planned), how well (actual)) is an extension of the CONSORT 2010 statement (item 5) and the SPIRIT 2013 statement (item 11). While the emphasis of the checklist is on trials, the guidance is intended to apply across all evaluative study designs. This paper presents the TIDieR checklist and guide, with an explanation and elaboration for each item, and examples of good reporting. The TIDieR checklist and guide should improve the reporting of interventions and make it easier for authors to structure accounts of their interventions, reviewers and editors to assess the descriptions, and readers to use the information.
Publisher: Public Library of Science (PLoS)
Date: 29-01-2013
Publisher: Springer Science and Business Media LLC
Date: 16-03-2017
Publisher: Elsevier BV
Date: 2016
DOI: 10.1038/JID.2015.355
Publisher: Public Library of Science (PLoS)
Date: 22-12-2009
Publisher: Wiley
Date: 26-10-2023
DOI: 10.5694/MJA2.52148
Publisher: Informa UK Limited
Date: 26-10-2023
Publisher: Elsevier BV
Date: 11-2016
Publisher: Informa UK Limited
Date: 30-10-2023
Publisher: Public Library of Science (PLoS)
Date: 26-04-2005
Publisher: Public Library of Science (PLoS)
Date: 27-09-2011
Publisher: Public Library of Science (PLoS)
Date: 30-07-2013
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 2014
Publisher: Elsevier BV
Date: 12-2009
Publisher: Wiley
Date: 19-09-2023
DOI: 10.1002/PATH.6203
Publisher: Elsevier BV
Date: 03-2003
Publisher: Wiley
Date: 19-02-2023
DOI: 10.5694/MJA2.51846
Publisher: Public Library of Science (PLoS)
Date: 29-03-2005
Publisher: Cold Spring Harbor Laboratory
Date: 20-03-2017
DOI: 10.1101/118356
Abstract: Academic publishing is evolving and our current system of correcting research post-publication is failing, both ideologically and practically. It does not encourage researchers to engage in consistent post-publication changes. Worse yet, post-publication ‘updates’ are misconstrued as punishments or admissions of guilt. We propose a different model that publishers of research can apply to the content they publish, ensuring that any post-publication amendments are seamless, transparent and propagated to all the countless places online where descriptions of research appear. At the center, the neutral term “amendment” describes all forms of post-publication change to an article. We lay out a straightforward and consistent process that applies to each of the three types of amendments: insubstantial, substantial, and complete. This proposed system supports the dynamic nature of the research process itself as researchers continue to refine or extend the work, removing the emotive climate particularly associated with retractions and corrections to published work. It allows researchers to cite and share the correct versions of articles with certainty, and for decision makers to have access to the most up to date information.
Publisher: Elsevier BV
Date: 2000
Publisher: Public Library of Science (PLoS)
Date: 26-03-2013
Publisher: Elsevier BV
Date: 11-2001
Publisher: Informa UK Limited
Date: 30-10-2023
Publisher: Public Library of Science (PLoS)
Date: 26-07-2011
Publisher: Wiley
Date: 10-2023
DOI: 10.5694/MJA2.52098
Publisher: Informa UK Limited
Date: 30-10-2023
Publisher: Informa UK Limited
Date: 27-10-2023
Publisher: Informa UK Limited
Date: 30-10-2023
Publisher: Wiley
Date: 05-02-2023
DOI: 10.5694/MJA2.51840
Publisher: Public Library of Science (PLoS)
Date: 25-08-2009
Publisher: Public Library of Science (PLoS)
Date: 28-02-2012
Publisher: Public Library of Science (PLoS)
Date: 06-06-2006
Publisher: Public Library of Science (PLoS)
Date: 31-05-2005
Publisher: Public Library of Science (PLoS)
Date: 26-08-2008
Publisher: Public Library of Science (PLoS)
Date: 21-07-2009
Publisher: Wiley
Date: 16-07-2023
DOI: 10.5694/MJA2.52015
Publisher: Informa UK Limited
Date: 27-10-2023
Publisher: BMJ
Date: 26-10-2017
DOI: 10.1136/BMJ.J4819
Publisher: Center for Open Science
Date: 30-01-2020
Abstract: The Community of Open Scholarship Grassroots Networks (COSGN) includes 120 grassroots networks, representing virtually every region of the world and every research discipline. These networks communicate and coordinate on topics of common interest. We propose, using an NSF 19-501 Full-Scale implementation grant, to formalize governance and coordination of the networks to maximize impact and establish standard practices for sustainability. In the project period, we will increase the capacity of COSGN to advance the research and community goals of the participating networks individually and collectively, and establish governance, succession planning, shared resources, and communication pathways to ensure an active, community-sustained network of networks. By the end of the project period, we will have established a self-sustaining network of networks that leverages disciplinary and regional diversity, actively collaborates across networks for grassroots organizing, and shares resources for maximum impact on culture change for open scholarship.
Publisher: Public Library of Science (PLoS)
Date: 26-02-2008
Publisher: Oxford University Press (OUP)
Date: 08-2017
Publisher: Cold Spring Harbor Laboratory
Date: 10-12-2022
DOI: 10.1101/2022.12.08.519666
Abstract: Research institutions and researchers have become increasingly concerned about poor research reproducibility and replicability, and research waste more broadly. Research institutions play an important role and understanding their intervention options is important. This review aims to identify and classify possible interventions to improve research quality, reduce waste, and improve reproducibility and replicability within research-performing institutions. Taxonomy development steps: 1) use of an exemplar paper of journal-level research quality improvement interventions, 2) 2-stage search in PubMed using seed and exemplar articles, and forward and backward citation searching to identify articles evaluating or describing research quality improvement, 3) elicited draft taxonomy feedback from researchers at an open-sciences conference workshop, and 4) cycles of revisions from the research team. The search identified 11 peer-reviewed articles on relevant interventions. Overall, 93 interventions were identified from peer-review literature and researcher reporting. Interventions covered before, during, and after study conduct research stages and the whole of institution. Types of intervention included: Tools, Education & Training, Incentives, Modelling & Mentoring, Review & Feedback, Expert involvement, and Policies & Procedures. Identified areas for research institutions to focus on to improve research quality, and for further research, include improving incentives to implement quality research practices, evaluating current interventions, encouraging no- or low-cost/high-benefit interventions, examining institutional research culture, and encouraging mentor-mentee relationships.
Publisher: Wiley
Date: 17-09-2023
DOI: 10.5694/MJA2.52087
Publisher: Public Library of Science (PLoS)
Date: 25-10-2011
Publisher: Public Library of Science (PLoS)
Date: 22-04-2014
Publisher: Public Library of Science (PLoS)
Date: 21-06-2010
Publisher: Public Library of Science (PLoS)
Date: 28-12-2004
Publisher: Journal of Chinese Integrative Medicine Press
Date: 15-09-2009
DOI: 10.3736/JCIM20090918
Publisher: Public Library of Science (PLoS)
Date: 27-01-2009
Publisher: Public Library of Science (PLoS)
Date: 21-12-2010
Publisher: Springer Science and Business Media LLC
Date: 02-02-2016
Publisher: Wiley
Date: 18-08-2023
DOI: 10.1002/IJGO.15033
Publisher: Maad Rayan Publishing Company
Date: 02-08-2023
Publisher: Informa UK Limited
Date: 26-10-2023
Publisher: Elsevier BV
Date: 02-2002
Publisher: Informa UK Limited
Date: 31-10-2023
Publisher: Public Library of Science (PLoS)
Date: 27-08-2013
Publisher: Public Library of Science (PLoS)
Date: 30-06-2011
Publisher: Ferrata Storti Foundation (Haematologica)
Date: 2010
Publisher: Informa UK Limited
Date: 31-10-2023
Publisher: Elsevier BV
Date: 10-2002
Publisher: Public Library of Science (PLoS)
Date: 14-10-2008
Publisher: Wiley
Date: 27-08-2023
DOI: 10.5694/MJA2.52074
Publisher: Public Library of Science (PLoS)
Date: 28-04-2009
Publisher: Informa UK Limited
Date: 27-10-2023
Publisher: Wiley
Date: 03-09-2023
DOI: 10.5694/MJA2.52070
Publisher: Wiley
Date: 08-09-2023
DOI: 10.1002/CJP2.341
Publisher: AOSIS
Date: 27-09-2023
Publisher: Springer Science and Business Media LLC
Date: 10-2023
Publisher: Informa UK Limited
Date: 26-10-2023
Publisher: Informa UK Limited
Date: 27-10-2023
Publisher: Public Library of Science (PLoS)
Date: 31-05-2011
Publisher: Front Matter
Date: 18-01-2022
DOI: 10.54900/REZF20N-589A1B5-68NCG
Abstract: If anyone thought that 2022 was going to be a time of peace and harmony in open access, some of the last salvos of 2021 will surely have put that to rest. 2021 was the year in which Plan S requirements kicked in, when transformative agreements were negotiated more widely than ever before and when publishers really showed their colours in the way they moderated their actions and, crucially, their language to describe and shape the open access world they would like to see. Undoubtedly, the arcane
Publisher: Public Library of Science (PLoS)
Date: 18-01-2005
Publisher: Public Library of Science (PLoS)
Date: 27-05-2008
Publisher: Wiley
Date: 29-12-2022
DOI: 10.5694/MJA2.51817
Publisher: Public Library of Science (PLoS)
Date: 31-05-2005
Publisher: Wiley
Date: 15-10-2023
DOI: 10.5694/MJA2.52108
Publisher: Public Library of Science (PLoS)
Date: 31-05-2005
Publisher: Public Library of Science (PLoS)
Date: 25-06-0001
Publisher: Public Library of Science (PLoS)
Date: 25-09-2007
Publisher: Elsevier BV
Date: 11-2003
Publisher: BMJ
Date: 06-10-1990
Publisher: Elsevier BV
Date: 05-2003
Publisher: Public Library of Science (PLoS)
Date: 25-09-2012
Publisher: Public Library of Science (PLoS)
Date: 30-04-2013
Publisher: BMJ
Date: 06-06-2014
DOI: 10.1136/BMJ.G3768
Publisher: Informa UK Limited
Date: 28-10-2023
Publisher: Informa UK Limited
Date: 30-10-2023
Publisher: Xia & He Publishing
Date: 07-08-2023
Publisher: Public Library of Science (PLoS)
Date: 24-04-2012
Publisher: Public Library of Science (PLoS)
Date: 26-05-2009
Publisher: Public Library of Science (PLoS)
Date: 30-10-2012
Publisher: Public Library of Science (PLoS)
Date: 30-10-2012
Publisher: Public Library of Science (PLoS)
Date: 28-05-2013
Publisher: Public Library of Science (PLoS)
Date: 30-09-2008
Publisher: Informa UK Limited
Date: 29-01-2016
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 13-01-2015
DOI: 10.1161/CIRCULATIONAHA.114.014508
Abstract: Prediction models are developed to aid health care providers in estimating the probability or risk that a specific disease or condition is present (diagnostic models) or that a specific event will occur in the future (prognostic models), to inform their decision making. However, the overwhelming evidence shows that the quality of reporting of prediction model studies is poor. Only with full and clear reporting of information on all aspects of a prediction model can risk of bias and potential usefulness of prediction models be adequately assessed. The Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD) Initiative developed a set of recommendations for the reporting of studies developing, validating, or updating a prediction model, whether for diagnostic or prognostic purposes. This article describes how the TRIPOD Statement was developed. An extensive list of items based on a review of the literature was created, which was reduced after a Web-based survey and revised during a 3-day meeting in June 2011 with methodologists, health care professionals, and journal editors. The list was refined during several meetings of the steering group and in e-mail discussions with the wider group of TRIPOD contributors. The resulting TRIPOD Statement is a checklist of 22 items, deemed essential for transparent reporting of a prediction model study. The TRIPOD Statement aims to improve the transparency of the reporting of a prediction model study regardless of the study methods used. The TRIPOD Statement is best used in conjunction with the TRIPOD explanation and elaboration document. To aid the editorial process and readers of prediction model studies, it is recommended that authors include a completed checklist in their submission (also available at www.tripod-statement.org).
Publisher: Wiley
Date: 02-04-2023
DOI: 10.5694/MJA2.51893
Publisher: Wiley
Date: 08-08-2023
DOI: 10.1111/INR.12875
Publisher: Elsevier BV
Date: 08-2003
Publisher: Wiley
Date: 30-04-2023
DOI: 10.5694/MJA2.51926
Publisher: Informa UK Limited
Date: 30-10-2023
Publisher: Center for Open Science
Date: 17-09-2019
Abstract: The primary goal of research is to advance knowledge. For that knowledge to benefit research and society, it must be trustworthy. Trustworthy research is robust, rigorous and transparent at all stages of design, execution and reporting. Initiatives such as the San Francisco Declaration on Research Assessment (DORA) and the Leiden Manifesto have led the way, bringing much needed global attention to the importance of taking a considered, transparent and broad approach to assessing research quality. Since publication in 2012, the DORA principles have been signed up to by over 1500 organizations and nearly 15,000 individuals. Despite this significant progress, assessment of researchers still rarely includes considerations related to trustworthiness, rigor and transparency. We have developed the Hong Kong Principles (HKPs) as part of the 6th World Conference on Research Integrity, with a specific focus on the need to drive research improvement through ensuring that researchers are explicitly recognized and rewarded (i.e., their careers are advanced) for behavior that leads to trustworthy research. The HKPs have been developed with the idea that their implementation could assist in how researchers are assessed for career advancement, with a view to strengthening research integrity. We present five principles: responsible research practices; transparent reporting; open science (open research); valuing a diversity of types of research; and recognizing all contributions to research and scholarly activity. For each principle we provide a rationale for its inclusion and provide examples where these principles are already being adopted.
Publisher: Public Library of Science (PLoS)
Date: 28-07-2009
Publisher: Informa UK Limited
Date: 31-10-2023
Publisher: Public Library of Science (PLoS)
Date: 23-02-2021
Publisher: Wiley
Date: 14-09-2023
DOI: 10.1002/HON.3218
Publisher: Elsevier BV
Date: 07-2006
Publisher: Public Library of Science (PLoS)
Date: 17-03-2009
Publisher: Wiley
Date: 06-08-2023
DOI: 10.5694/MJA2.52054
Publisher: Public Library of Science (PLoS)
Date: 30-11-2004
Publisher: Public Library of Science (PLoS)
Date: 28-08-2012
Publisher: Royal College of Surgeons of England
Date: 04-2016
Abstract: ‘Some editors are failed writers, but so are most writers.’ TS Eliot (1888–1965) Have you ever wondered what medical journal editors do? Most editors in the medical field are unpaid and the work is part of the wider culture of service provided by so many in the medical profession. Together with the editorial board and the publisher, an editor will decide the direction of the journal. For instance, decisions are made about what sort of material should be published. One of the most common tasks, however, is the daily screening of manuscripts submitted for publication, many of which are rejected without peer review owing to poor quality, redundant material or the subject of the article being beyond the scope of the journal. After deciding which peer reviewers to send an article to, the editor must make a final decision on a manuscript, which may not necessarily concur with the advice given by the reviewers. With this comes a huge amount of personal responsibility, and one to the organisation the editor represents. Take the example of George Lundberg, the editor of JAMA: The Journal of the American Medical Association, who was fired from his position after 17 years with the alleged faux pas of rushing to publish an article to coincide with the Clinton impeachment hearings ‘to extract political leverage.’ Lundberg published research showing that 60% of college students surveyed in 1991 did not think that engaging in oral sex was classed as actually ‘having sex.’ 1 While neither the methods used in the survey nor the results were disputed, the timing of the publication at an awkward political juncture was. Extrapolating this, editors are therefore not just responsible for the content of what is published but also the impact of publications in the wider arena. Editors must also handle a great deal of correspondence, including author queries and complaints, and respond to them in a timely manner.
Communication with the team, the publisher, authors and readers is a vital skill. Finally, the editor needs to deal with the journal’s ethical policy when examples of plagiarism, author disputes or other forms of misconduct are evident. Breaches of publication ethics are forms of scientific misconduct that can undermine science and challenge editors, many of whom have little formal training in this field. In this respect, the Committee on Publication Ethics (COPE), founded in 1997 as a voluntary body, has become a central player. COPE provides a discussion forum and advice as well as guidelines for scientific editors with the aim of finding practical ways to deal with forms of misconduct. The Annals is a member of COPE and follows its code of conduct for journal editors. 2 It is a privilege that the current chair of COPE, Dr Barbour, and her colleagues have written this final article in the medical publishing series about challenges in publication ethics. I hope you have found this series useful and enjoyed reading the range of articles we have published from many experts in their fields. JYOTI SHAH, Commissioning Editor. 1. Sanders SA, Reinisch JM. Would you say you ‘had sex’ if…? JAMA 1999; 281: 275–277. 2. Committee on Publication Ethics. Code of Conduct and Best Practice Guidelines for Journal Editors. Harleston, UK: COPE; 2011.
Publisher: Public Library of Science (PLoS)
Date: 27-05-2005
Publisher: Elsevier BV
Date: 08-2010
Publisher: Public Library of Science (PLoS)
Date: 31-08-2010
Publisher: Elsevier BV
Date: 04-2002
Publisher: Public Library of Science (PLoS)
Date: 22-01-2008
Publisher: Public Library of Science (PLoS)
Date: 24-06-2008
Publisher: BMJ
Date: 05-10-2022
DOI: 10.1136/BMJ.O2334
Publisher: Informa UK Limited
Date: 27-10-2023
Publisher: Informa UK Limited
Date: 30-10-2023
Publisher: Wiley
Date: 27-06-2023
DOI: 10.5694/MJA2.51998
Publisher: Elsevier BV
Date: 07-2002
Publisher: Public Library of Science (PLoS)
Date: 08-01-2008
Publisher: Public Library of Science (PLoS)
Date: 22-02-2005
Publisher: Public Library of Science (PLoS)
Date: 26-06-2007
Publisher: Elsevier BV
Date: 06-2001
Publisher: Wiley
Date: 11-09-2023
DOI: 10.1002/EHF2.14502
Publisher: Elsevier BV
Date: 11-2002
Publisher: Public Library of Science (PLoS)
Date: 26-02-2013
Publisher: Public Library of Science (PLoS)
Date: 31-07-2012
Publisher: Elsevier BV
Date: 03-2003
Publisher: Informa UK Limited
Date: 30-10-2023
Publisher: Public Library of Science (PLoS)
Date: 29-04-2008
Publisher: AMPCo
Date: 08-2019
DOI: 10.5694/MJA2.50265
Publisher: Wiley
Date: 22-08-2023
DOI: 10.1002/EJHF.2991
Publisher: Journal of Chinese Integrative Medicine Press
Date: 15-07-2010
DOI: 10.3736/JCIM20100702
Abstract: The CONSORT statement is used worldwide to improve the reporting of randomised controlled trials. Kenneth Schulz and colleagues describe the latest version, CONSORT 2010, which updates the reporting guideline based on new methodological evidence and accumulating experience. To encourage dissemination of the CONSORT 2010 Statement, this article is freely accessible on bmj.com and will also be published in the Lancet, Obstetrics and Gynecology, PLoS Medicine, Annals of Internal Medicine, Open Medicine, Journal of Clinical Epidemiology, BMC Medicine, and Trials.
Publisher: Public Library of Science (PLoS)
Date: 26-01-2010
Publisher: Public Library of Science (PLoS)
Date: 24-11-2009
Publisher: Informa UK Limited
Date: 31-10-2023
Publisher: Wiley
Date: 06-08-2023
DOI: 10.5694/MJA2.52048
Publisher: Elsevier BV
Date: 11-2003
Publisher: Public Library of Science (PLoS)
Date: 24-04-2007
Publisher: Elsevier BV
Date: 12-2002
Publisher: BMJ
Date: 13-06-2013
DOI: 10.1136/BMJ.F3601
Publisher: American College of Physicians
Date: 04-11-2008
DOI: 10.7326/0003-4819-149-9-200811040-00009
Abstract: In 2005, draft guidelines were published for reporting studies of quality improvement as the initial step in a consensus process for development of a more definitive version. The current article contains the revised version, which we refer to as Standards for QUality Improvement Reporting Excellence (SQUIRE). This narrative progress report summarizes the special features of improvement that are reflected in SQUIRE and describes major differences between SQUIRE and the initial draft guidelines. It also explains the development process, which included formulation of responses to informal feedback, written commentaries, and input from publication guideline developers; ongoing review of the literature on the epistemology of improvement and methods for evaluating complex social programs; and a meeting of stakeholders for critical review of the guidelines' content and wording, followed by commentary on sequential versions from an expert consultant group. Finally, the report discusses limitations of and unresolved questions about SQUIRE; ancillary supporting documents and alternative versions under development; and plans for dissemination, testing, and further development of SQUIRE.
Publisher: Wiley
Date: 19-03-2023
DOI: 10.5694/MJA2.51866
Publisher: Springer Science and Business Media LLC
Date: 25-10-2023
Publisher: Elsevier BV
Date: 04-2001
Publisher: Wiley
Date: 21-08-2023
DOI: 10.1111/ALL.15848
Publisher: F1000 Research Ltd
Date: 04-09-2017
DOI: 10.12688/F1000RESEARCH.12400.1
Abstract: Background: Scientific editors (i.e., those who make decisions on the content and policies of a journal) have a central role in the editorial process at biomedical journals. However, very little is known about the training needs of these editors or what competencies are required to perform effectively in this role. Methods: We conducted a survey of perceptions and training needs among scientific editors from major editorial organizations around the world, followed by a modified Delphi process in which we invited the same scientific editors to rate the importance of competency-related statements obtained from a previous scoping review. Results: A total of 148 participants completed the survey of perceptions and training needs. At least 80% of participants agreed on six of the 38 skill and expertise-related statements presented to them as being important or very important to their role as scientific editors. At least 80% agreed on three of the 38 statements as necessary skills they perceived themselves as possessing (well or very well). The top five items on participants’ list of top training needs were training in statistics, research methods, publication ethics, recruiting and dealing with peer reviewers, and indexing of journals. The three rounds of the Delphi were completed by 83, 83, and 73 participants, respectively, which ultimately produced a list of 23 “highly rated” competency-related statements and another 86 “included” items. Conclusion: Both the survey and the modified Delphi process will be critical for understanding knowledge and training gaps among scientific editors when designing curriculum around core competencies in the future.
Publisher: Wiley
Date: 08-2023
DOI: 10.1111/AOGS.14650
Publisher: Elsevier BV
Date: 02-2001
Location: United Kingdom of Great Britain and Northern Ireland
Location: Australia
Location: Netherlands
No related grants have been discovered for Virginia Barbour.