2022
DOI: 10.1007/978-3-031-20047-2_10

Disentangling Architecture and Training for Optical Flow

Cited by 17 publications (11 citation statements)
References 53 publications
“…LiteFlowNet2 [HTL20] is already an optimized version of FlowNet 2.0 [IMS*17]; in comparison, PWC‐Net [SYLK18] has more potential for optimization/compression. Moreover, it has recently been shown that PWC‐Net can achieve accuracy similar to RAFT when trained on a large‐scale synthetic dataset [SVH*21], and that PWC‐Net achieves favorable trade‐offs versus other state‐of‐the‐art methods when selecting for runtime performance or higher image resolutions [SHR*22]. Hence, we select PWC‐Net for further compression.…”
Section: Methods
Mentioning confidence: 99%
“…Optical Flow Estimation: Several deep architectures have been proposed for optical flow [4,8,23,29,30,38]. Among these, Recurrent All-Pairs Field Transforms (RAFT) [30] has shown significant performance improvements over previous methods, inspiring many subsequent works [6,14,26,27,35]. Following the structure of the RAFT architecture, complementary studies [12,14,33,35,39] proposed advancements in feature extraction, the 4D correlation volume, the recurrent update blocks, and, more recently, transformer extensions [6,39].…”
Section: Related Work
Mentioning confidence: 99%
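
The RAFT-style scheme referred to in the excerpt above (an all-pairs 4D correlation volume refined by a recurrent update block) can be summarized in a short sketch. The PyTorch code below is an illustrative toy version, not the implementation of RAFT or of any cited work: it assumes a single-point correlation lookup and a plain GRU cell in place of RAFT's multi-scale lookup and convolutional update block, and every name and size in it (all_pairs_correlation, UpdateBlock, estimate_flow, hidden=64, iters=8) is an assumption made for illustration only.

import torch
import torch.nn as nn
import torch.nn.functional as F


def all_pairs_correlation(f1, f2):
    # f1, f2: (B, C, H, W) feature maps -> (B, H*W, H, W) correlation volume,
    # i.e. one (H, W) similarity map per source pixel (the "4D" volume).
    B, C, H, W = f1.shape
    corr = torch.einsum('bci,bcj->bij', f1.flatten(2), f2.flatten(2)) / C ** 0.5
    return corr.view(B, H * W, H, W)


def lookup(corr, flow):
    # For every source pixel, bilinearly sample its correlation map at the
    # position given by the current flow estimate (a single point only; RAFT
    # samples a multi-scale neighbourhood instead).
    B, N, H, W = corr.shape
    ys, xs = torch.meshgrid(torch.arange(H), torch.arange(W), indexing='ij')
    coords = torch.stack([xs, ys], dim=-1).float().to(flow.device)      # (H, W, 2)
    tgt = coords[None] + flow.permute(0, 2, 3, 1)                       # (B, H, W, 2)
    scale = torch.tensor([W - 1, H - 1], dtype=torch.float32, device=flow.device)
    grid = 2.0 * tgt / scale - 1.0                                      # to [-1, 1]
    sampled = F.grid_sample(corr, grid, align_corners=True)             # (B, N, H, W)
    idx = torch.arange(N, device=flow.device)
    return sampled.view(B, N, N)[:, idx, idx].view(B, 1, H, W)          # own map only


class UpdateBlock(nn.Module):
    # GRU-style stand-in for RAFT's recurrent update block: it consumes the
    # correlation lookup and the current flow and emits a residual flow update.
    def __init__(self, hidden=64):
        super().__init__()
        self.hidden = hidden
        self.gru = nn.GRUCell(input_size=3, hidden_size=hidden)  # 1 corr + 2 flow
        self.head = nn.Linear(hidden, 2)

    def forward(self, corr_feat, flow, h):
        h = self.gru(torch.cat([corr_feat, flow], dim=-1), h)
        return h, self.head(h)


def estimate_flow(f1, f2, iters=8):
    # Iteratively refine a zero-initialised flow field, as in RAFT-style methods.
    B, C, H, W = f1.shape
    corr = all_pairs_correlation(f1, f2)      # built once, looked up every step
    flow = torch.zeros(B, 2, H, W)
    update = UpdateBlock()
    h = torch.zeros(B * H * W, update.hidden)
    for _ in range(iters):
        c = lookup(corr, flow).permute(0, 2, 3, 1).reshape(-1, 1)
        f = flow.permute(0, 2, 3, 1).reshape(-1, 2)
        h, dflow = update(c, f, h)
        flow = flow + dflow.view(B, H, W, 2).permute(0, 3, 1, 2)   # residual update
    return flow


if __name__ == "__main__":
    # random tensors standing in for a learned feature extractor's output
    feats1, feats2 = torch.randn(1, 32, 12, 16), torch.randn(1, 32, 12, 16)
    print(estimate_flow(feats1, feats2).shape)   # torch.Size([1, 2, 12, 16])

In actual RAFT-style methods the correlation volume is additionally pooled into a multi-scale pyramid and the update block also receives context features; the sketch omits both for brevity.
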
“…Though its performance gain is notable, AutoFlow employs synthetic augmentations. The work in [27] utilizes AutoFlow and argues that it is important to disentangle the architecture from the training pipeline. [27] also points out that some of the performance improvements of recent methods are due to hyperparameters, dataset extensions, and training optimizations.…”
Section: Related Work
Mentioning confidence: 99%