2018
DOI: 10.1016/j.jocs.2018.07.003
Multiple sclerosis identification by convolutional neural network with dropout and parametric ReLU

Cited by 169 publications (81 citation statements)
References 38 publications
“…The values of the mask were then determined to minimize the loss through iterative learning. In this study, as an activation function, the rectified linear unit (ReLU) function [43][44][45][46] rather than the Sigmoid function is used as shown in Figure 7. That is because a vanishing gradient (in which a gradient converges to zero) occurs if the Sigmoid function is used [47][48][49].…”
Section: Detection of Target Region
confidence: 99%
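The excerpt above motivates ReLU over the sigmoid via the vanishing-gradient problem: the sigmoid's derivative never exceeds 0.25, so the product of derivatives across many layers shrinks toward zero, while the ReLU's derivative is exactly 1 on positive inputs. A minimal numeric illustration (not from the cited paper; the depth and evaluation point are arbitrary choices for demonstration):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # peaks at 0.25 when x == 0

def relu_grad(x):
    return float(x > 0)  # 1 for positive inputs, 0 otherwise

# Magnitude of a gradient backpropagated through a 10-layer chain,
# ignoring weights: the product of per-layer activation derivatives
# evaluated at x = 0.5.
depth = 10
sig_factor = sigmoid_grad(0.5) ** depth
relu_factor = relu_grad(0.5) ** depth

print(f"sigmoid chain factor: {sig_factor:.2e}")  # collapses toward zero
print(f"ReLU chain factor:    {relu_factor:.2e}")  # stays at 1
```

With the sigmoid, the 10-layer factor is already below 1e-6, which is the convergence-to-zero behavior the excerpt describes.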
“…Other questions already addressed in radiomics studies were the differentiation of MS from neuromyelitis optica spectrum disorders [28][29][30] and the distinction of MS patients from healthy controls. On the latter topic, studies based on deep learning also exist [31][32][33]. Eitel et al. [34] additionally investigated which features the algorithm uses for classification and were thus able to show that, besides the typical lesions, to a lesser extent normal-appearing areas, such as e.g.…”
Section: Integration of Clinical Data
confidence: unclassified
“…By combining both parametric rectified linear unit (PReLU) and dropout techniques, Zhang et al. [65] proposed an improved 10-layer convolutional neural network including 7 convolution layers and 3 fully connected layers. They collected 681 healthy control brain slices and 676 multiple sclerosis brain slices to complete their experiment.…”
Section: Related Work
confidence: 99%
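The two building blocks named in that excerpt, PReLU and dropout, can be sketched in a few lines of numpy. This is not the 10-layer network of Zhang et al.; it is a minimal stand-in for one layer, with hypothetical sizes, showing PReLU's learnable negative slope and inverted dropout's train-time rescaling:

```python
import numpy as np

rng = np.random.default_rng(0)

def prelu(x, alpha=0.25):
    """Parametric ReLU: identity for x > 0, slope alpha (a learnable
    parameter in the full network) for x <= 0."""
    return np.where(x > 0, x, alpha * x)

def dropout(x, p=0.5, training=True):
    """Inverted dropout: zero each unit with probability p during
    training and rescale survivors by 1/(1-p); identity at test time."""
    if not training:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

# One toy fully connected layer followed by PReLU and dropout
# (batch of 4 samples, 16 -> 8 features; shapes are illustrative only).
x = rng.standard_normal((4, 16))
W = rng.standard_normal((16, 8)) * 0.1
h = prelu(x @ W, alpha=0.25)
h = dropout(h, p=0.5, training=True)
print(h.shape)  # (4, 8)
```

Unlike plain ReLU, PReLU keeps a small gradient on negative inputs (here scaled by `alpha`), while dropout regularizes by randomly silencing units, the combination the excerpt credits for the improved network.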