DOI: 10.58530/2022/0425

ReImagining the Young Adult Human Connectome Project (HCP) Diffusion MRI Dataset

Abstract: The Human Connectome Project (HCP) has brought significant advancements in hardware, acquisition, and preprocessing. Even a decade after its collection, the HCP diffusion MRI data remains relevant for its richness and high resolution. Noise and geometric distortions, however, are particularly pronounced in this dataset. In this work, we have reprocessed nearly the entire HCP dMRI dataset while applying several recent processing improvements. We compared the quality of the newly processed dMRI outputs to…

Cited by 3 publications (8 citation statements)
References 0 publications
“…To reduce the number of tests and focus on specific white matter (WM) regions, we used a set of regions of interest (ROIs) for the current study. The ROIs were inspired by the Johns Hopkins University (JHU) WM ROIs; however, they were manually redrawn (by A.N., with over 13 years of experience in neuroanatomy and neuroimaging) on an average DT brain template built from the HCP dataset to avoid the issues with left/right structural asymmetry that have been reported for the original JHU ROIs (19,25,28–30). Moreover, the JHU ROIs were defined on a scalar map and used scalar-based registration, which tends to result in misregistrations (25).…”
Section: Methods
Mentioning confidence: 99%
“…We used the TORTOISEV3 (version 3, www.tortoisedti.org) pipeline to process the dMRI data, because Irfanoglu et al. had shown considerable improvement in the dMRI metrics using this pipeline compared to the released version of the HCP dataset (15,16). In the following, we briefly describe the different stages of pre-processing: (a) Denoising was performed using the model-free noise-mapping technique proposed by Veraart et al., with a kernel radius of 3 (17).…”
Section: Preprocessing
Mentioning confidence: 99%
“…Data underwent a standard preprocessing pipeline using Tortoise V3.1.2 (9,10), including motion, eddy current, and EPI distortion corrections performed using DrBuddy (11). Preprocessed data were registered to the template space, defined by the HCP-DWI tensor atlas (12), via DrTamas (13). DrTamas is a registration tool based on tensor data rather than on image intensity.…”
Section: Data Preprocessing
Mentioning confidence: 99%
“…Accordingly, we performed both a voxel-level and an ROI-level analysis. In the voxel-wise analysis, we computed the registration between the subject space and the HCP-DWI atlas (12) and moved all subject maps to the template space to investigate the estimation average and variance associated with each parameter of interest. In the ROI analysis, we performed a quantitative analysis of the signed relative error of the parameter estimates obtained with OAS compared with the reference estimates in each ROI, as well as their linear regression.…”
Section: Framework Performance Analysis
Mentioning confidence: 99%
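As a rough illustration of the ROI-level comparison described in the last statement, the sketch below computes the signed relative error of OAS-derived parameter estimates against reference estimates within each ROI and fits a per-ROI linear regression; all array names and shapes are assumptions, not the authors' code.

```python
# Sketch of the quoted ROI analysis: per-ROI signed relative error of the
# OAS estimates versus the reference estimates, plus a linear regression.
# Inputs (names and shapes) are hypothetical 3-D maps in template space.
import numpy as np
from scipy.stats import linregress

def roi_comparison(oas_map, ref_map, roi_labels, label_ids):
    """Return per-ROI mean signed relative error and regression results."""
    results = {}
    for lid in label_ids:
        mask = roi_labels == lid
        ref = ref_map[mask].astype(float)
        oas = oas_map[mask].astype(float)
        # Signed relative error: (estimate - reference) / reference,
        # leaving zero-reference voxels out of the average.
        rel_err = (oas - ref) / np.where(ref != 0, ref, np.nan)
        slope, intercept, rvalue, _, _ = linregress(ref, oas)
        results[lid] = {
            "mean_signed_rel_err": float(np.nanmean(rel_err)),
            "slope": slope,
            "intercept": intercept,
            "r": rvalue,
        }
    return results
```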