2019
DOI: 10.1002/mp.13490

A deep learning method for prediction of three‐dimensional dose distribution of helical tomotherapy

Abstract: Purpose: To develop a deep learning method for prediction of three‐dimensional (3D) voxel‐by‐voxel dose distributions of helical tomotherapy (HT). Methods: Using previously treated HT plans as training data, a deep learning model named U‐ResNet‐D was trained to predict a 3D dose distribution. First, the contoured structures and dose volumes were converted from the plan database into 3D matrices with a program based on the Visualization Toolkit (VTK), then transferred to U‐ResNet‐D for correlating anatomical feat…
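
The abstract describes a U-Net-style encoder-decoder with residual connections that maps rasterized structure and image volumes to a voxel-wise 3D dose grid. The published U-ResNet-D architecture is not detailed in this excerpt, so the following is only a minimal PyTorch sketch of that general idea; the channel counts, network depth, and the assumed multi-channel structure-mask input are illustrative assumptions, not the authors' model.

```python
# Minimal sketch (not the published U-ResNet-D): a small 3D encoder-decoder
# with residual blocks that maps stacked structure/CT channels to a dose volume.
import torch
import torch.nn as nn

class ResBlock3D(nn.Module):
    """Two 3x3x3 convolutions with a residual (skip) connection."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv3d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv3d(channels, channels, kernel_size=3, padding=1)
        self.norm1 = nn.InstanceNorm3d(channels)
        self.norm2 = nn.InstanceNorm3d(channels)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        h = self.act(self.norm1(self.conv1(x)))
        h = self.norm2(self.conv2(h))
        return self.act(x + h)  # residual connection

class DosePredictionUNet(nn.Module):
    """Two-level 3D U-Net with residual blocks; output is one dose channel."""
    def __init__(self, in_channels=8, base=16):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv3d(in_channels, base, 3, padding=1),
                                  ResBlock3D(base))
        self.down = nn.Conv3d(base, base * 2, kernel_size=2, stride=2)
        self.enc2 = ResBlock3D(base * 2)
        self.up = nn.ConvTranspose3d(base * 2, base, kernel_size=2, stride=2)
        self.dec1 = nn.Sequential(nn.Conv3d(base * 2, base, 3, padding=1),
                                  ResBlock3D(base))
        self.head = nn.Conv3d(base, 1, kernel_size=1)  # voxel-wise dose output

    def forward(self, x):
        e1 = self.enc1(x)                         # full-resolution features
        e2 = self.enc2(self.down(e1))             # half-resolution features
        d1 = torch.cat([self.up(e2), e1], dim=1)  # U-Net skip connection
        return self.head(self.dec1(d1))

# Usage: one sample of 8 binary structure-mask channels on a 64^3 grid.
model = DosePredictionUNet(in_channels=8)
pred_dose = model(torch.zeros(1, 8, 64, 64, 64))  # -> shape (1, 1, 64, 64, 64)
```

The residual blocks and the encoder-decoder skip connection are the two ingredients the "U-ResNet" name implies; everything else here (normalization choice, base width, depth) is a placeholder.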

Cited by 83 publications (86 citation statements)
References 47 publications
“…There have been some researches on 3D dose distribution prediction using deep learning 43‐47. Nguyen et al employed Hierarchically Densely Connected U‐Net model to implement dose prediction for IMRT treatment plan of the prostate, and the results showed that the averaged D max and D mean of dose differences for all contoured structures were within 5.1% of the prescription dose and the average DSC between the predicted and clinical truth was 0.91 43.…”
Section: Discussion (mentioning)
confidence: 99%
“…Nguyen et al employed Hierarchically Densely Connected U‐Net model to implement dose prediction for IMRT treatment plan of the prostate, and the results showed that the averaged D max and D mean of dose differences for all contoured structures were within 5.1% of the prescription dose and the average DSC between the predicted and clinical truth was 0.91 43. Liu et al used 2D residual network to achieve dose prediction for helical tomotherapy of nasopharyngeal cases, which reported the mean absolute differences of D max and D mean for OARs were within 4.2% and 2.4%, respectively, and averaged 3D dose prediction bias ranged from 2.0% to 2.3% 44. It is difficult to directly compare the CNN models developed by us with those by these researchers, since different patient databases and treatment modality were used.…”
Section: Discussion (mentioning)
confidence: 99%
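
The metrics quoted in these statements (D_max and D_mean differences expressed as a percentage of the prescription dose, and the Dice similarity coefficient between predicted and clinical dose volumes) can be computed directly from voxel-wise dose grids. Below is a minimal NumPy sketch; the array names, the OAR mask, and the isodose level used for the DSC are illustrative assumptions, not the cited papers' exact evaluation protocol.

```python
# Minimal sketch, assuming NumPy arrays: pred/truth are 3D dose grids in Gy,
# mask is a boolean organ-at-risk mask, rx is the prescription dose in Gy.
import numpy as np

def dmax_dmean_diff(pred, truth, mask, rx):
    """Absolute D_max / D_mean differences inside `mask`, as % of prescription."""
    d_max = abs(pred[mask].max() - truth[mask].max()) / rx * 100.0
    d_mean = abs(pred[mask].mean() - truth[mask].mean()) / rx * 100.0
    return d_max, d_mean

def isodose_dsc(pred, truth, level):
    """Dice similarity coefficient between predicted and true isodose volumes."""
    p, t = pred >= level, truth >= level
    return 2.0 * np.logical_and(p, t).sum() / (p.sum() + t.sum())

# Example on random volumes: one cubic OAR mask and the 50% isodose of 60 Gy.
rng = np.random.default_rng(0)
pred = rng.uniform(0, 60, (64, 64, 64))
truth = rng.uniform(0, 60, (64, 64, 64))
oar = np.zeros(pred.shape, dtype=bool)
oar[20:40, 20:40, 20:40] = True
print(dmax_dmean_diff(pred, truth, oar, rx=60.0), isodose_dsc(pred, truth, 30.0))
```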
“…[19][20][21][22] A DL-based algorithm with convolutional neural networks can automatically extract complex features by backpropagation to output appropriate results, eliminating the need for manual feature extraction. 23,24 We hypothesized that a DL-based method could generate realistic VNC images that suppress the contrast material signal while maintaining the CT numbers of bones. Here, we describe a fully automated method that uses DL to generate more realistic VNC images (VNC DL ) than VNC DECT images to improve the accuracy of treatment planning.…”
Section: Introduction (mentioning)
confidence: 99%
“…In recent years, a number of deep learning (DL)-based ATP techniques have been proposed using various DL neural networks (18)(19)(20)(21)(22)(23)(24)(25)(26)(27)(28)(29)(30)(31)(32)(33). Several review articles on AI in radiation oncology (34)(35)(36), and radiotherapy treatment planning (37)(38)(39), have been published, which demonstrated the interests on AI and the significance of ATP, summarization of the achievements and challenges, as well as insightful discussion on future studies.…”
Section: Introduction (mentioning)
confidence: 99%