“…The different fundamental blocks adapted for UNet modification were (Table 4): (1) residual block [75,76,78,84,88,105,129,135,138]; (2) classifier in encoder [145]; (3) Xception block [56,88]; (4) dense layer block [68,100,102,122,142]; (5) recurrent residual block; (6) attention block [65,66,71,75,113,125,128,129,131-135,139,161]; (7) dropout layer [70,76,86,95,101,102,134,138]; (8) dilated convolution [67,76]; (9) transpose convolution [66,88,95,137,139]; and (10) squeeze-and-excitation (SE) block [92,103,125,133,138].…”
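To make the catalog above concrete, the following is a minimal sketch of one of the listed blocks, the squeeze-and-excitation (SE) block, written in PyTorch. The framework choice, the class name `SEBlock`, and the reduction ratio of 16 are illustrative assumptions rather than details taken from the cited papers; the surveyed UNet variants may implement the block differently.

```python
import torch
import torch.nn as nn


class SEBlock(nn.Module):
    """Illustrative squeeze-and-excitation block (channel attention)."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # "Squeeze": global average pooling collapses each feature map to a single value.
        self.pool = nn.AdaptiveAvgPool2d(1)
        # "Excitation": a two-layer bottleneck MLP produces per-channel weights in (0, 1).
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.pool(x).view(b, c)          # (B, C) channel descriptors
        w = self.fc(w).view(b, c, 1, 1)      # per-channel attention weights
        return x * w                         # rescale the input feature maps


if __name__ == "__main__":
    feats = torch.randn(2, 64, 32, 32)       # hypothetical encoder feature maps
    print(SEBlock(64)(feats).shape)           # torch.Size([2, 64, 32, 32])
```

Such a block is typically inserted after a convolutional unit in the encoder or decoder path, so that the output feature maps are reweighted channel-wise before being passed on or concatenated across a skip connection.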