A1 Refereed original research article in a scientific journal

Examining the generalizability of research findings from archival data




Authors Delios Andrew, Clemente Elena Giulia, Tao Wu, Tan Hongbin, Wang Yong, Gordon Michael, Viganola Domenico, Chen Zhaowei, Dreber Anna, Johannesson Magnus, Pfeiffer Thomas, Generalizability Tests Forecasting Collaboration, Uhlmann Eric Luis

Publisher National Academy of Sciences

Publication year 2022

Journal Proceedings of the National Academy of Sciences of the United States of America

Volume 119

Issue 30

DOI https://doi.org/10.1073/pnas.2120377119

Web address https://www.pnas.org/doi/abs/10.1073/pnas.2120377119

Self-archived copy’s web address https://research.utu.fi/converis/portal/detail/Publication/176201900

Preprint address https://research.utu.fi/converis/portal/detail/Publication/176201900


Abstract

This initiative examined systematically the extent to which a large set of archival research findings generalizes across contexts. We repeated the key analyses for 29 original strategic management effects in the same context (direct reproduction) as well as in 52 novel time periods and geographies; 45% of the reproductions returned results matching the original reports together with 55% of tests in different spans of years and 40% of tests in novel geographies. Some original findings were associated with multiple new tests. Reproducibility was the best predictor of generalizability—for the findings that proved directly reproducible, 84% emerged in other available time periods and 57% emerged in other geographies. Overall, only limited empirical evidence emerged for context sensitivity. In a forecasting survey, independent scientists were able to anticipate which effects would find support in tests in new samples.


Downloadable publication

This is an electronic reprint of the original article.
This reprint may differ from the original in pagination and typographic detail. Please cite the original version.





Last updated on 2024-11-26 at 21:26