2019
DOI: 10.1101/615161
Preprint

Toward A Reproducible, Scalable Framework for Processing Large Neuroimaging Datasets

Abstract: Emerging neuroimaging datasets (collected through modalities such as Electron Microscopy, Calcium Imaging, or X-ray Microtomography) describe the location and properties of neurons and their connections at unprecedented scale, promising new ways of understanding the brain. These modern imaging techniques used to interrogate the brain can quickly accumulate gigabytes to petabytes of structural brain imaging data. Unfortunately, many neuroscience laboratories lack the computational expertise or resources to work…

Cited by 6 publications (8 citation statements)
References 34 publications
“…For the purposes of the above comparison, these graphs were generated from relatively small volumes. This algorithm can be readily adapted to batch processing frameworks (e.g., [11]) and large spatial databases (e.g., [12]).…”
Section: Experiments and Results
confidence: 99%
“…We designed our toolkit such that even minimal coding skills and copy-pasting of simple design patterns can be leveraged to reduce user burden. As the community continues to formalize use cases and data storage paradigms, programmatic workflows like SABER [11], LONI [12], Luigi [13], or other workflow managers [14], [15], [16] may allow for additional simplification and can directly leverage these functions. Point and click graphical interfaces may also follow.…”
Section: Methods
confidence: 99%
“…4) Processing: Tool and algorithm developers commonly target specific data storage ecosystems in order to reduce the burden of supporting several disparate ecosystems and data-standards [23], [11]. By leveraging shock-absorber tools like intern, algorithm developers can write code once and deploy it to a variety of datastores.…”
Section: B
confidence: 99%
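The "write code once, deploy it to a variety of datastores" idea quoted above can be sketched as a thin adapter layer: the algorithm depends only on a minimal volume-read interface, and each datastore supplies its own adapter. This is an illustrative sketch only; the names (`VolumeStore`, `InMemoryStore`, `mean_intensity`) are hypothetical and are not intern's actual API.

```python
# Hypothetical sketch of the "shock-absorber" pattern: algorithms are
# written against one minimal interface; per-datastore adapters
# translate that interface to each backend.
from abc import ABC, abstractmethod


class VolumeStore(ABC):
    """Minimal read interface an algorithm depends on."""

    @abstractmethod
    def get_cutout(self, x, y, z):
        """Return a 3D cutout; x, y, z are (start, stop) tuples."""


class InMemoryStore(VolumeStore):
    """Toy adapter backed by a nested list; a real adapter would wrap a
    remote volumetric service instead."""

    def __init__(self, volume):
        self.volume = volume  # volume[z][y][x]

    def get_cutout(self, x, y, z):
        return [
            [row[x[0]:x[1]] for row in plane[y[0]:y[1]]]
            for plane in self.volume[z[0]:z[1]]
        ]


def mean_intensity(store: VolumeStore, x, y, z) -> float:
    """Algorithm written once against the interface: it runs unchanged
    against any adapter, local or remote."""
    cutout = store.get_cutout(x, y, z)
    values = [v for plane in cutout for row in plane for v in row]
    return sum(values) / len(values)
```

Swapping `InMemoryStore` for an adapter over a remote datastore requires no change to `mean_intensity`, which is the portability property the citing paper attributes to intern.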
“…We designed our toolkit such that even minimal coding skills and copy-pasting of simple design patterns can be leveraged to reduce user burden. As the community continues to formalize use cases and data storage paradigms, programmatic workflows like SABER [12], LONI [13], Luigi [14], or other workflow managers [15,16,17] may allow for additional simplification and can directly leverage these functions. Point and click graphical interfaces may also follow.…”
Section: Architecture
confidence: 99%
“…By leveraging shock-absorber tools like intern, algorithm developers can write code once and deploy it to a variety of datastores. As a proof of concept, we adopted a synapse-detection algorithm based upon the U-net architecture [12,24]. This algorithm targets data downloaded from an intern VolumeResource, which means that it is trivially portable to data downloaded from any supported volumetric data storage service.…”
Section: Processing
confidence: 99%
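The portability claim in the quote above rests on the detector consuming a plain array rather than a datastore handle. A minimal sketch of that idea, with `threshold_detect` and the 0.5 cutoff as hypothetical stand-ins for the U-net inference step (not the paper's actual algorithm):

```python
# Illustrative only: a detector written against a plain NumPy array,
# so the same code runs on a cutout fetched from any volumetric store.
import numpy as np


def threshold_detect(volume: np.ndarray, cutoff: float = 0.5) -> np.ndarray:
    """Return a binary uint8 mask of voxels above `cutoff`.

    Because the input is just an ndarray, the function does not care
    which datastore produced it.
    """
    return (volume > cutoff).astype(np.uint8)
```

Any service that yields its cutouts as arrays can feed this function unchanged, which mirrors how a detector targeting intern-downloaded data is portable across the datastores intern supports.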