ORCID Profile
0000-0003-3596-5980
Current Organisation
Utrecht University
Publisher: University of California Press
Date: 2017
DOI: 10.1525/collabra.102
Abstract: In this paper, we present three retrospective observational studies that investigate the relation between data sharing and statistical reporting inconsistencies. Previous research found that reluctance to share data was related to a higher prevalence of statistical errors, often in the direction of statistical significance (Wicherts, Bakker, & Molenaar, 2011). We therefore hypothesized that journal policies about data sharing and data sharing itself would reduce these inconsistencies. In Study 1, we compared the prevalence of reporting inconsistencies in two similar journals on decision making with different data sharing policies. In Study 2, we compared reporting inconsistencies in psychology articles published in PLOS journals (with a data sharing policy) and Frontiers in Psychology (without a stipulated data sharing policy). In Study 3, we looked at papers published in the journal Psychological Science to check whether papers with or without an Open Practice Badge differed in the prevalence of reporting errors. Overall, we found no relationship between data sharing and reporting inconsistencies. We did find that journal policies on data sharing seem extremely effective in promoting data sharing. We argue that open data is essential in improving the quality of psychological science, and we discuss ways to detect and reduce reporting inconsistencies in the literature.
Publisher: Elsevier BV
Date: July 2016
Publisher: SAGE Publications
Date: June 2015
DOI: 10.1037/gpr0000034
Abstract: Replication is often viewed as the demarcation between science and nonscience. However, contrary to the commonly held view, we show that in the current (selective) publication system replications may increase bias in effect size estimates. Specifically, we examine the effect of replication on bias in estimated population effect size as a function of publication bias and the studies’ sample size or power. We analytically show that incorporating the results of published replication studies will in general not lead to less bias in the estimated population effect size. We therefore conclude that mere replication will not solve the problem of overestimation of effect sizes. We discuss the implications of our findings for interpreting results of published and unpublished studies, and for conducting and interpreting results of meta-analyses. We also discuss solutions for the problem of overestimation of effect sizes, such as discarding and not publishing small studies with low power, and implementing practices that completely eliminate publication bias (e.g., study registration).
Publisher: Public Library of Science (PLoS)
Date: 31 July 2020
Publisher: Public Library of Science (PLoS)
Date: 15 March 2017
Publisher: Public Library of Science (PLoS)
Date: 29 September 2016
Publisher: Elsevier BV
Date: October 2020
Publisher: Wiley
Date: 3 June 2023
DOI: 10.1002/icd.2438
Abstract: Many of us, especially in the more experimental tradition, are used to doing it all ourselves: we don't know better than to collect our own data for every new study. We invest considerable amounts of time and resources in designing experiments, ethical review applications, participant recruitment, finding and training measuring assistants, and writing data management plans. And after all the hard work we end up with a sample size that hardly ever meets current standards. What if we could skip, or outsource, some of these study aspects, allowing us to devote more of our time, energy, expertise, and experience to data analysis, reading, writing, and theory development? We argue that sharing data, expertise, and infrastructure could contribute to improving the credibility of research results, and to practicing more sustainable developmental science. In addition, we discuss several reasons that we believe may (currently) dissuade researchers from considering such sharing.
Publisher: Frontiers Media SA
Date: 25 November 2016
Publisher: Springer Science and Business Media LLC
Date: 28 March 2015
DOI: 10.1007/s11336-015-9444-2
Abstract: We respond to the commentaries that Waldman and Lilienfeld (Psychometrika, 2015) and Wigboldus and Dotsch (Psychometrika, 2015) provided in response to Sijtsma's (Psychometrika, 2015) discussion article on questionable research practices. Specifically, we discuss the fear of an increased dichotomy between the substantive and statistical aspects of research that may arise when the latter aspects are laid entirely in the hands of a statistician, remedies for false positives and replication failure, and the status of data exploration, and we provide a re-definition of the concept of questionable research practices.
Publisher: Informa UK Limited
Date: 19 January 2017
Publisher: Public Library of Science (PLoS)
Date: 10 December 2014
Location: Netherlands
No related grants have been discovered for Coosje Lisabet Sterre Veldkamp.