2020 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE)
DOI: 10.1109/fuzz48607.2020.9177555
Additional Feature Layers from Ordered Aggregations for Deep Neural Networks

Cited by 4 publications (14 citation statements)
References 28 publications
“…Regarding the selection of the operators, they manually established fixed operators. In contrast, our OWA layer proposal in [13] is also composed of several OWA operators that generate new fused feature maps, but with important differences. With respect to the usage of the new feature maps, in our proposal, the fused feature maps are used to augment the information of the network, preserving the original feature maps.…”
Section: Related Work
Confidence: 99%
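The statement above describes the OWA layer only at a high level: several OWA (ordered weighted averaging) operators produce fused feature maps that are appended to, rather than replacing, the original ones. The sketch below shows how such a layer could look, assuming a PyTorch setting; the class name OWALayer, the n_operators parameter, and the softmax re-parametrisation of the weights are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class OWALayer(nn.Module):
    """Appends extra feature maps built by OWA aggregations of the input
    channels, preserving the original feature maps (a sketch, not the
    reference code)."""

    def __init__(self, in_channels: int, n_operators: int = 4):
        super().__init__()
        # One learnable weight vector per OWA operator; a softmax keeps the
        # weights non-negative and summing to 1, as OWA requires.
        self.raw_weights = nn.Parameter(torch.randn(n_operators, in_channels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W). Sort channel values in decreasing order at every
        # spatial position, so each operator weights ranks, not fixed channels.
        sorted_x, _ = torch.sort(x, dim=1, descending=True)   # (B, C, H, W)
        w = F.softmax(self.raw_weights, dim=1)                 # (K, C)
        # Weighted sum over the sorted channel axis: one new map per operator.
        fused = torch.einsum('kc,bchw->bkhw', w, sorted_x)     # (B, K, H, W)
        # Augment rather than replace: concatenate fused maps to the originals.
        return torch.cat([x, fused], dim=1)                    # (B, C+K, H, W)
```

A layer like this placed after a convolutional block would feed C + K maps to the next convolution, whose input channel count must be enlarged accordingly.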
“…The OWA layer that we explore in this work was previously defined in [13,38], where we presented some preliminary results about the impact of the layer in CNNs. In the previous works, we found an increase in the accuracy of a medium-sized VGG13 network on a large image dataset, but we did not study the learned OWA operators and the way they codify information.…”
Section: Related Work
Confidence: 99%
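For context on where such a layer sits in a VGG-style network like the VGG13 mentioned above, the following hypothetical block reuses the OWALayer sketch from the previous listing; the helper name vgg_block and all channel counts are illustrative only, not the configuration used in [13, 38].

```python
import torch.nn as nn


def vgg_block(in_ch: int, out_ch: int, n_owa: int = 4) -> nn.Sequential:
    """A VGG-style conv block with an OWA layer inserted between the two
    convolutions (hypothetical placement)."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        OWALayer(out_ch, n_operators=n_owa),            # adds n_owa fused maps
        nn.Conv2d(out_ch + n_owa, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.MaxPool2d(2),
    )
```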