2023
DOI: 10.1109/tgrs.2023.3297203
MiCro: Modeling Cross-Image Semantic Relationship Dependencies for Class-Incremental Semantic Segmentation in Remote Sensing Images

Cited by 4 publications (3 citation statements), all published in 2024
References 56 publications
“…3, the first category, known as data-replay methods, involves storing a portion of past training data as exemplar memory, as in [26], [27], [28], [29], [30], [31], [32], [33], [34], [35], [36]. The second category, termed data-free methods, includes approaches such as [37], [38], [39], [40], [41], [42], [43], [44], [45], [46], [47], [48], [49]. These methods use transfer learning techniques, such as knowledge distillation, to inherit the capabilities of the old model.…”
Section: Semantic Drift (citation type: mentioning)
confidence: 99%
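The data-free recipe described in the statement above can be made concrete with a short sketch. The following is a minimal, hypothetical PyTorch example of output-level knowledge distillation for class-incremental segmentation: the new model's logits on the old classes are pulled toward the frozen old model's soft predictions, while the new classes remain free to fit the current labels. The function name `distillation_loss` and the temperature value are illustrative assumptions, not the formulation of any particular cited method.

```python
import torch
import torch.nn.functional as F

def distillation_loss(new_logits, old_logits, temperature=2.0):
    """Pixel-wise KL distillation over the old classes (illustrative sketch).

    new_logits: (B, C_old + C_new, H, W) logits of the model being trained.
    old_logits: (B, C_old, H, W) logits of the frozen old model.
    Only the first C_old channels of the new model are matched to the old
    model, so the new classes stay free to be learned from current labels.
    """
    c_old = old_logits.shape[1]
    log_p_new = F.log_softmax(new_logits[:, :c_old] / temperature, dim=1)
    p_old = F.softmax(old_logits / temperature, dim=1)
    # KL(p_old || p_new) per pixel, averaged over batch and spatial dims.
    kl = (p_old * (p_old.clamp_min(1e-8).log() - log_p_new)).sum(dim=1)
    return kl.mean() * temperature ** 2

# Illustrative usage with random tensors: 2 old classes, 1 new class.
old_model_logits = torch.randn(4, 2, 64, 64)   # from the frozen old model
new_model_logits = torch.randn(4, 3, 64, 64)   # from the current model
loss_kd = distillation_loss(new_model_logits, old_model_logits)
```

In training, such a term would typically be added to the cross-entropy loss on the new classes with a weighting coefficient.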
“…Qiu et al. [48] use self-attention to capture both within-class and between-class knowledge. Current methods continue to explore more in-depth distillation strategies, including class-weighted distillation [42], [49], dense intermediate-feature alignment [40], [44], class-wise enhancement [147], [148], cross-image relationship modeling [45], prototype rehearsal [27], [53], [101], and cross-scene modeling [149]. With respect to network architecture, some research shows that a stronger backbone, such as a Transformer [49], [150], can improve distillation performance.…”
Section: Regularization-based Manner (citation type: mentioning)
confidence: 99%
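As a rough illustration of the "cross-image relationship modeling" idea mentioned in the statement above, the sketch below distills pairwise image-level affinities from the old model to the new one, so that relationships between images are preserved across incremental steps. This is a generic sketch under assumed inputs (globally pooled backbone features), not the specific formulation of [45] or of MiCro.

```python
import torch
import torch.nn.functional as F

def relation_matrix(features):
    """Cosine-similarity affinity matrix over a batch of image embeddings.

    features: (N, D), one pooled backbone embedding per image.
    Returns an (N, N) matrix encoding cross-image semantic relationships.
    """
    f = F.normalize(features, dim=1)
    return f @ f.t()

def relation_distillation_loss(new_feats, old_feats):
    """Match the new model's cross-image affinities to the old model's."""
    return F.mse_loss(relation_matrix(new_feats), relation_matrix(old_feats))

# Illustrative usage: 8 images, 256-dim pooled features.
old_feats = torch.randn(8, 256)   # from the frozen old model
new_feats = torch.randn(8, 256)   # from the current model
loss_rel = relation_distillation_loss(new_feats, old_feats)
```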