Abstract
Most neuroimaging studies display results that represent only a tiny fraction of the collected data. While it is conventional to present “only the significant results” to the reader, here we suggest that this practice has several negative consequences for both reproducibility and understanding. It hides away most of the results of an analysis and leads to problems of selection bias and irreproducibility, both of which have recently been recognized as major issues in neuroimaging. Opaque, all-or-nothing thresholding, even if well-intentioned, places undue influence on arbitrary filter values, hinders clear communication of scientific results, wastes data, is antithetical to good scientific practice, and leads to conceptual inconsistencies. It is also inconsistent with the properties of the acquired data and the underlying biology being studied. Instead of presenting only a few statistically significant locations and hiding away the remaining results, we propose that studies should “highlight” the former while also showing as much as possible of the rest. This is distinct from, but complementary to, using data-sharing repositories: the initial presentation of results has an enormous impact on the interpretation of a study. We present practical examples for voxelwise, regionwise and cross-study analyses using publicly available data that were previously analyzed by 70 teams (NARPS; Botvinik-Nezer et al., 2020), showing that it is possible to balance the goal of displaying a full set of results with that of providing the reader with reasonably concise and “digestible” findings. In particular, the highlighting approach sheds useful light on the kind of variability present among the NARPS teams’ results, which primarily reflects varying strength of agreement rather than disagreement. A meta-analysis built on the informative “highlighting” approach shows this relative agreement, while one using the standard “hiding” approach does not. We describe how this simple but powerful change in practice, focusing on highlighting results rather than hiding all but the strongest ones, can help address many large concerns within the field, or at least provide more complete information about them. We include a list of practical suggestions for results reporting to improve reproducibility, cross-study comparisons and meta-analyses.
Highlights
Most studies do not present all results of their analysis, hiding subthreshold ones.
Hiding results negatively affects the interpretation and understanding of the study.
Neuroimagers should present all results of their study, highlighting key ones.
Using the public NARPS data, we show several benefits of the “highlighting” approach.
The highlighting approach improves individual studies and meta-analyses.
Competing Interest Statement
The authors have declared no competing interest.