2019 IEEE International Conference on Big Data (Big Data)
DOI: 10.1109/bigdata47090.2019.9006467

Exascale Deep Learning to Accelerate Cancer Research

Abstract: Deep learning, through the use of neural networks, has demonstrated remarkable ability to automate many routine tasks when presented with sufficient data for training. The neural network architecture (e.g., number of layers, types of layers, connections between layers, etc.) plays a critical role in determining what, if anything, the neural network is able to learn from the training data. The trend for neural network architectures, especially those trained on ImageNet, has been to grow ever deeper and more comp…

Cited by 17 publications (2 citation statements)
References 29 publications
“…MENNDL's built-in training termination leverages truncated training in conjunction with a dynamic early termination criterion. It ends training at 20 epochs, or earlier if loss is stable over the past 10 epochs, making this NAS implementation one of the most effective on HPC systems [1], [4]. We put PENGUIN to the test by comparing the actual walltime of training our 6,000 NNs using MENNDL with two termination scenarios: (i) with MENNDL's built-in training termination; and (ii) with PENGUIN augmenting the termination decision.…”
Section: Walltime Speedup (mentioning)
confidence: 99%
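The termination rule quoted above (a hard cap of 20 epochs, cut short when the loss has been stable over the last 10 epochs) can be illustrated with a minimal sketch. The function name should_stop, the stability tolerance tol, and the sample loss values below are illustrative assumptions, not MENNDL's or PENGUIN's actual implementation.

    def should_stop(loss_history, max_epochs=20, window=10, tol=1e-3):
        """Return True when training should terminate.

        loss_history: per-epoch loss values observed so far
        max_epochs:   hard cap on the number of training epochs
        window:       number of trailing epochs checked for stability
        tol:          assumed threshold; the loss is treated as "stable"
                      if its spread over the window is below tol
        """
        epoch = len(loss_history)
        if epoch >= max_epochs:
            return True
        if epoch >= window:
            recent = loss_history[-window:]
            if max(recent) - min(recent) < tol:
                return True
        return False

    # Example: a loss curve that plateaus after a few epochs terminates early.
    losses = [1.0, 0.6, 0.45, 0.40, 0.39] + [0.385] * 15
    for epoch in range(1, 21):
        if should_stop(losses[:epoch]):
            print(f"terminate after epoch {epoch}")  # -> terminate after epoch 15
            break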
“…Neural networks (NN) are powerful models that are increasingly used in traditional high-performance computing (HPC) scientific simulations and new research areas, such as high-performance artificial intelligence and high-throughput data analytics, to solve problems in physics [1], materials science [2], neuroscience [3], and medical imaging [4] among other domains. Finding suitable NNs is a time-consuming process involving several rounds of hyperparameter selection, training, validation, and manual inspection.…”
Section: Introduction (mentioning)
confidence: 99%