SLSNet: Skin lesion segmentation using a lightweight generative adversarial network
2021
DOI: 10.1016/j.eswa.2021.115433


Cited by 48 publications (32 citation statements)
References 21 publications
“…Table 3 shows various performance measures found in this survey research. In cases, Brain: Retina: [39][40][41][42][43][44][45][46][47][48][49][50][51][52][53][54] Microscopic: [55][56][57][58][59][60][61][62][63][64] Multi-Organs: [65][66][67][68][69][70][71] Liver: [72][73][74][75][76][77] Skin: [78][79][80][81][82][83] Orthopedic: [84][85][86][87] Cardiac: [88]…”
Section: Performance Metrics
mentioning, confidence: 99%
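The excerpt above lists where segmentation performance measures are reported rather than how they are computed. As a minimal illustration, assuming Dice and Jaccard/IoU are among the surveyed measures (the snippet itself does not say which), the two scores for a predicted binary lesion mask could be computed as follows:

```python
# Illustrative sketch only (not from the cited survey): Dice and Jaccard/IoU overlap
# scores for a predicted binary mask against a ground-truth mask.
import numpy as np

def dice_and_iou(pred, truth, eps=1e-7):
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    dice = (2.0 * inter + eps) / (pred.sum() + truth.sum() + eps)
    iou = (inter + eps) / (union + eps)
    return float(dice), float(iou)

# Toy example: two overlapping square "lesions" on a 64x64 grid.
pred = np.zeros((64, 64)); pred[10:40, 10:40] = 1
truth = np.zeros((64, 64)); truth[15:45, 15:45] = 1
print(dice_and_iou(pred, truth))
```

Both scores range from 0 (no overlap) to 1 (perfect overlap), and Dice is never smaller than IoU for the same pair of masks.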
“…We have also noticed that the l_mse function is utilized in 6 different studies, followed by l_mae, which is used in only 2 studies.
- [20, 22, 44, 45, 48, 51, 58, 62, 72, 76, 79, 84, 85], [86, 92, 94, 96-98, 117, 125, 150, 151]: The binary cross-entropy loss function evaluates each projected probability against the actual class outcome, which is either 0 or 1
- L4 l_dice [27, 51, 64, 66, 67, 90, 94, 95, 117, 120, 122], [130, 141, 144, 145, 149]: The term dice loss comes from the Sørensen-Dice coefficient, a statistic created in the 1940s to determine the similarity of two samples
- L5 l_cyc [106], [140]: The element-wise loss is employed to ensure self-similarity during cyclic transformation when an unaligned training pair is provided
- L6 l_rp [22]: Regional perceptual loss function
- L7 l_image [91]: The element-wise data fidelity loss in the image domain is used to validate structure similarity to the target when an aligned training pair is provided
- [101]: Perceptual loss function
- L9 l_tvl1 [101]: The loss function is based on the TVL1 algorithm
- L10 l_al [89]: The multiple paths of loss are penalized differently in an asymmetric loss function
- L11 l_acc [89]: The function is based on achieved accuracy from the confusion matrix
- L12 l_fl [77]: The focal loss function is utilized to address class imbalance issues during the network training
- L13 l_mse [48, 66, 68, 69, 81, 1...…”
Section: Loss Functions
mentioning, confidence: 99%
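Two of the losses tabulated above, binary cross-entropy and Dice loss, are standard enough to sketch. The snippet below is illustrative only, not code from the cited survey or from SLSNet; it assumes a binary lesion mask and sigmoid-activated predictions, written in PyTorch:

```python
# Illustrative sketch of two loss functions from the table above (not the survey's code).
import torch

def bce_loss(pred_probs, target, eps=1e-7):
    # Binary cross-entropy: each predicted probability is scored against the 0/1 label.
    p = pred_probs.clamp(eps, 1.0 - eps)
    return -(target * p.log() + (1 - target) * (1 - p).log()).mean()

def dice_loss(pred_probs, target, smooth=1.0):
    # Dice loss: 1 minus the Sørensen-Dice overlap between prediction and ground truth.
    intersection = (pred_probs * target).sum()
    return 1.0 - (2.0 * intersection + smooth) / (pred_probs.sum() + target.sum() + smooth)

# Toy usage: random sigmoid outputs vs. a random binary mask.
pred = torch.rand(2, 1, 64, 64)
mask = (torch.rand(2, 1, 64, 64) > 0.5).float()
print(bce_loss(pred, mask).item(), dice_loss(pred, mask).item())
```

In segmentation work the two are often summed, since cross-entropy gives smooth per-pixel gradients while Dice directly optimizes region overlap.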
“…It would also be interesting to explore the segmentation of lungs in healthy patients using the AI model trained on COVID-19 patients. Other neural network techniques such as generative adversarial networks (GANs) [80] or transfer learning and loss schemes [38,44,81] can also be adapted. A big data framework can be used to integrate comorbidity factors [82] in the AI models.…”
Section: Strengths, Weaknesses and Extensions
mentioning, confidence: 99%
“…Dosovitskiy et al. replaced the CNN-based encoder with a vision transformer to improve the image recognition performance of the network [23]. Sarker et al. proposed SLSNet, a network that reduces computational cost by using 1-D kernel factorized networks for skin lesion segmentation with minimal resources [24]. Wu et al. [25] proposed FAT-Net, a new feature adaptive transformer network based on the classical encoder-decoder architecture.…”
Section: Introduction
mentioning, confidence: 99%
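"1-D kernel factorized networks" here refers to splitting a 2-D convolution into two 1-D convolutions. The block below is a generic sketch of that idea, not the authors' actual SLSNet layer: a k×k convolution is replaced by a k×1 convolution followed by a 1×k convolution, which (for equal channel widths) cuts the spatial kernel cost from k·k to 2·k weights per channel pair.

```python
# Generic sketch of 1-D kernel factorization (not the exact SLSNet block): a kxk
# convolution is approximated by a kx1 convolution followed by a 1xk convolution.
import torch
import torch.nn as nn

class FactorizedConv(nn.Module):
    def __init__(self, in_ch, out_ch, k=3):
        super().__init__()
        self.vertical = nn.Conv2d(in_ch, out_ch, kernel_size=(k, 1), padding=(k // 2, 0))
        self.horizontal = nn.Conv2d(out_ch, out_ch, kernel_size=(1, k), padding=(0, k // 2))
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        # Same output resolution as a kxk convolution with "same" padding.
        return self.act(self.horizontal(self.act(self.vertical(x))))

x = torch.randn(1, 16, 128, 128)
print(FactorizedConv(16, 32)(x).shape)  # -> torch.Size([1, 32, 128, 128])
```

Separable/factorized convolutions of this kind are a common building block for lightweight segmentation networks aimed at limited-resource settings.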