ORCID Profile
0000-0003-3569-931X
Current Organisation
University of Oxford
Publisher: Center for Open Science
Date: 08-04-2022
Abstract: In recent years, the scientific community has called for improvements in the credibility, robustness, and reproducibility of research, characterized by increased interest in and promotion of open and transparent research practices. While progress has been positive, there is a lack of consideration about how this approach can be embedded into undergraduate and postgraduate research training. Specifically, a critical overview of the literature which investigates how integrating open and reproducible science may influence student outcomes is needed. In this paper, we provide the first critical review of literature surrounding the integration of open and reproducible scholarship into teaching and learning and its associated outcomes in students. Our review highlighted how embedding open and reproducible scholarship appears to be associated with (1) students’ scientific literacies (i.e., students’ understanding of open research, consumption of science, and the development of transferable skills), (2) student engagement (i.e., motivation and engagement with learning, collaboration, and engagement in open research), and (3) students’ attitudes towards science (i.e., trust in science and confidence in research findings). However, our review also identified a need for more robust and rigorous methods within pedagogical research, including more interventional and experimental evaluations of teaching practice. We discuss implications for teaching and learning scholarship.
Publisher: Center for Open Science
Date: 14-03-2022
Abstract: Training in robust research practices is becoming increasingly common. However, many course participants may encounter challenges in implementing what they learned after returning to their research groups. In this piece, we summarize insights and "lessons learned" from a group of former course participants. We offer practical tips on implementation and cultural change that may be useful for researchers at any career stage. In addition, we provide a list of considerations for course instructors to help them support course attendees after training is over.
Publisher: Center for Open Science
Date: 04-2022
Abstract: [Preprint Manuscript accepted at Psychological Science] In April 2019, Psychological Science published its first issue in which all research articles received the Open Data badge. We used that issue to investigate the effectiveness of this badge, focusing on the adherence to its aim at Psychological Science: sharing both data and code to ensure reproducibility of results. Twelve researchers of varying experience levels attempted to reproduce the results of the empirical articles in the target issue (at least three researchers per article). We found that while all 14 articles provided at least some data and six provided analysis code, only one article was rated to be exactly reproducible, and three essentially reproducible with minor deviations. We suggest that researchers should be encouraged to adhere to the higher standard in force at Psychological Science. Moreover, a check of reproducibility during peer review may be preferable to the ‘disclosure method’ of awarding badges.
Publisher: SAGE Publications
Date: 02-02-2023
DOI: 10.1177/09567976221140828
Abstract: In April 2019, Psychological Science published its first issue in which all Research Articles received the Open Data badge. We used that issue to investigate the effectiveness of this badge, focusing on the adherence to its aim at Psychological Science: sharing both data and code to ensure reproducibility of results. Twelve researchers of varying experience levels attempted to reproduce the results of the empirical articles in the target issue (at least three researchers per article). We found that all 14 articles provided at least some data and six provided analysis code, but only one article was rated to be exactly reproducible, and three were rated as essentially reproducible with minor deviations. We suggest that researchers should be encouraged to adhere to the higher standard in force at Psychological Science. Moreover, a check of reproducibility during peer review may be preferable to the disclosure method of awarding badges.
Publisher: The Royal Society
Date: 05-2023
DOI: 10.1098/RSOS.221255
Abstract: In recent years, the scientific community has called for improvements in the credibility, robustness and reproducibility of research, characterized by increased interest in and promotion of open and transparent research practices. While progress has been positive, there is a lack of consideration about how this approach can be embedded into undergraduate and postgraduate research training. Specifically, a critical overview of the literature which investigates how integrating open and reproducible science may influence student outcomes is needed. In this paper, we provide the first critical review of literature surrounding the integration of open and reproducible scholarship into teaching and learning and its associated outcomes in students. Our review highlighted how embedding open and reproducible scholarship appears to be associated with (i) students' scientific literacies (i.e. students' understanding of open research, consumption of science and the development of transferable skills), (ii) student engagement (i.e. motivation and engagement with learning, collaboration and engagement in open research), and (iii) students' attitudes towards science (i.e. trust in science and confidence in research findings). However, our review also identified a need for more robust and rigorous methods within pedagogical research, including more interventional and experimental evaluations of teaching practice. We discuss implications for teaching and learning scholarship.
Publisher: Public Library of Science (PLoS)
Date: 05-01-2023
DOI: 10.1371/JOURNAL.PCBI.1010750
Abstract: Open, reproducible, and replicable research practices are a fundamental part of science. Training is often organized on a grassroots level, offered by early career researchers, for early career researchers. Buffet-style courses that cover many topics can inspire participants to try new things; however, they can also be overwhelming. Participants who want to implement new practices may not know where to start once they return to their research team. We describe ten simple rules to guide participants of relevant training courses in implementing robust research practices in their own projects, once they return to their research group. This includes (1) prioritizing and planning which practices to implement, which involves obtaining support and convincing others involved in the research project of the added value of implementing new practices, (2) managing problems that arise during implementation, and (3) making reproducible research and open science practices an integral part of a future research career. We also outline strategies that course organizers can use to prepare participants for implementation and support them during this process.
Publisher: Linnaeus University
Date: 10-07-2023
Abstract: Most of the commonly used and endorsed guidelines for systematic review protocols and reporting standards have been developed for intervention research. These excellent guidelines have been adopted as the gold standard for systematic reviews as an evidence synthesis method. In the current paper, we highlight some issues that may arise from adopting these guidelines beyond intervention designs, including in basic behavioural, cognitive, experimental, and exploratory research. We have adapted and built upon the existing guidelines to establish a complementary, comprehensive, and accessible tool for designing, conducting, and reporting Non-Intervention, Reproducible, and Open Systematic Reviews (NIRO-SR). NIRO-SR is a checklist composed of two parts that provide itemised guidance on preparing a systematic review protocol for pre-registration (Part A) and reporting the review (Part B) in a reproducible and transparent manner. This paper, the tool, and an open repository (osf.io/f3brw) provide a comprehensive resource for those who aim to conduct a high-quality, reproducible, and transparent systematic review of non-intervention studies.
Publisher: Springer Science and Business Media LLC
Date: 21-02-2022
Location: United Kingdom of Great Britain and Northern Ireland
No related grants have been discovered for Mirela Zaneva.