Functional magnetic resonance imaging (fMRI) is a standard tool to investigate the neural correlates of cognition. fMRI noninvasively measures brain activity, allowing identification of patterns evoked by tasks performed during scanning. Despite the long history of this technique, the idiosyncrasies of each dataset have led to the use of ad hoc preprocessing protocols customized for nearly every different study. This approach is time-consuming, error-prone, and unsuitable for combining datasets from many sources. Here we showcase fMRIPrep (http://fmriprep.org), a robust tool to prepare human fMRI data for statistical analysis. This software instrument addresses the reproducibility concerns of the established protocols for fMRI preprocessing. By leveraging the Brain Imaging Data Structure (BIDS) to standardize both the input datasets (MRI data as stored by the scanner) and the outputs (data ready for modeling and analysis), fMRIPrep is capable of preprocessing a diversity of datasets without manual intervention. In support of the growing popularity of fMRIPrep, this protocol describes how to integrate the tool into a task-based fMRI investigation workflow.
Functional magnetic resonance imaging (fMRI) is a popular method for in vivo neuroimaging. Modern fMRI sequences are often weighted towards the blood oxygen level dependent (BOLD) signal, which is closely linked to neuronal activity (Logothetis, 2002). This weighting is achieved by tuning several parameters to increase the BOLD-weighted signal contrast. One such parameter is "TE," or echo time. TE is the time elapsed between the excitation of protons (the source of the MRI signal) and the measurement of the signal. Although the total measured signal magnitude decays with echo time, BOLD sensitivity increases (Silvennoinen et al., 2003). The optimal TE maximizes the BOLD signal weighting based on a number of factors, including several MRI scanner parameters (e.g., field strength), imaged tissue composition (e.g., grey vs. white matter), and proximity to air-tissue boundaries.
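The trade-off between signal decay and BOLD sensitivity can be illustrated with a toy calculation. Under the common mono-exponential decay model, the measured signal at echo time TE is proportional to exp(-TE/T2*), and the BOLD sensitivity is proportional to TE · exp(-TE/T2*), which peaks when TE equals the tissue's T2*. The sketch below assumes this simple model and an illustrative grey-matter T2* value; it is not drawn from the abstract itself.

```python
import math

def bold_sensitivity(te_ms: float, t2star_ms: float) -> float:
    """Relative BOLD sensitivity under a mono-exponential T2* decay model.

    Proportional to TE * exp(-TE / T2*): longer echo times boost BOLD
    contrast, but the overall signal magnitude decays away.
    """
    return te_ms * math.exp(-te_ms / t2star_ms)

# Assumed, illustrative value: grey-matter T2* of roughly 45 ms at 3 T.
t2star = 45.0

# Scan a range of candidate echo times and pick the most sensitive one.
candidate_tes = range(5, 120, 5)
best_te = max(candidate_tes, key=lambda te: bold_sensitivity(te, t2star))
print(best_te)  # the maximum falls at TE == T2* (45 ms here)
```

Under this model the optimum is simply TE = T2*, which is why optimal echo times differ across field strengths, tissue types, and regions near air-tissue boundaries, where T2* itself varies.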
Brainhack is an innovative meeting format that promotes scientific collaboration and education in an open, inclusive environment. This NeuroView describes the myriad benefits for participants and the research community and how Brainhacks complement conventional formats to augment scientific progress.
The Target Article by Lee et al. (2019) highlights the ways in which ongoing concerns about research reproducibility extend to model-based approaches in cognitive science. Whereas Lee et al. focus primarily on the importance of research practices to improve model robustness, we propose that the transparent sharing of model specifications, including their inputs and outputs, is also essential to improving the reproducibility of model-based analyses. We outline an ongoing effort (within the context of the Brain Imaging Data Structure community) to develop standards for the sharing of the structure of computational models and their outputs.
The Brain Imaging Data Structure (BIDS) is a standard for organizing and describing neuroimaging datasets. It serves not only to facilitate the process of data sharing and aggregation, but also to simplify the application and development of new methods and software for working with neuroimaging data. Here, we present an extension of BIDS to include positron emission tomography (PET) data (PET-BIDS). We describe the PET-BIDS standard in detail and share several open-access datasets curated following PET-BIDS. Additionally, we highlight several tools that are already available for converting, validating and analyzing PET-BIDS datasets.
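To make the idea of a BIDS-organized PET dataset concrete, the sketch below builds a minimal, hypothetical layout: one subject, one PET acquisition, a JSON sidecar with a few metadata fields, and a dataset-level description file. The directory names, filenames, and metadata values here are illustrative assumptions; the BIDS specification is the authoritative reference for the required entities and fields.

```python
import json
import tempfile
from pathlib import Path

# Build a throwaway dataset root (illustrative layout, not a real study).
root = Path(tempfile.mkdtemp()) / "example_pet_dataset"
pet_dir = root / "sub-01" / "pet"
pet_dir.mkdir(parents=True)

# The imaging file and its JSON sidecar share a common filename stem.
(pet_dir / "sub-01_pet.nii.gz").touch()
sidecar = {
    # Example metadata values, assumed for illustration only.
    "TracerName": "FDG",
    "InjectedRadioactivity": 185.0,
    "InjectedRadioactivityUnits": "MBq",
}
(pet_dir / "sub-01_pet.json").write_text(json.dumps(sidecar, indent=2))

# Every BIDS dataset carries a top-level dataset_description.json.
(root / "dataset_description.json").write_text(
    json.dumps({"Name": "Example PET dataset", "BIDSVersion": "1.6.0"})
)

# Print the resulting tree, relative to the dataset root.
for path in sorted(root.rglob("*")):
    print(path.relative_to(root))
```

Because the structure is predictable, downstream tools such as validators and converters can locate the imaging data and its metadata without per-study configuration, which is the practical benefit the abstract describes.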
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context of the citation and describe whether the article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.