2018
DOI: 10.3390/app8101924

Temporal Action Detection in Untrimmed Videos from Fine to Coarse Granularity

Abstract: Temporal action detection in long, untrimmed videos is an important yet challenging task that requires not only recognizing the categories of actions in videos, but also localizing the start and end times of each action. In recent years, artificial neural networks such as the Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM) have significantly improved performance on various computer vision tasks, including action detection. In this paper, we make the most of different granular classifiers and propo…
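As an illustration of what the task in the abstract produces, the sketch below turns per-frame class scores into temporal detections, i.e. (category, start time, end time) segments. This is a minimal, hypothetical post-processing step, not the paper's method; the function name, threshold, and scores are assumptions for illustration.

```python
# Illustrative sketch (NOT the paper's method): grouping per-frame action
# scores into temporal detections with a category and start/end times.
def detect_segments(frame_scores, labels, threshold=0.5, fps=25.0):
    """Group consecutive frames whose top class score exceeds `threshold`
    into (label, start_sec, end_sec) segments."""
    detections = []
    current = None  # (class_index, start_frame) of the open segment
    for t, scores in enumerate(frame_scores):
        best = max(range(len(scores)), key=lambda c: scores[c])
        active = scores[best] >= threshold
        if current is None and active:
            current = (best, t)  # open a new segment
        elif current is not None and (not active or best != current[0]):
            cls, start = current  # close the segment at frame t
            detections.append((labels[cls], start / fps, t / fps))
            current = (best, t) if active else None
    if current is not None:  # close a segment running to the last frame
        cls, start = current
        detections.append((labels[cls], start / fps, len(frame_scores) / fps))
    return detections

# Tiny example: 6 frames, 2 action classes ("run", "jump"), 1 fps.
scores = [[0.1, 0.2], [0.9, 0.1], [0.8, 0.2],
          [0.2, 0.9], [0.3, 0.8], [0.1, 0.2]]
print(detect_segments(scores, ["run", "jump"], threshold=0.5, fps=1.0))
# → [('run', 1.0, 3.0), ('jump', 3.0, 5.0)]
```

Real detectors score overlapping multi-scale proposals rather than thresholding frames, but the output format is the same.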

Cited by 7 publications (2 citation statements)
References 40 publications
“…Recent research in dense video captioning [9,10] has worked to avoid the high computational cost of sliding windows. These works follow an end-to-end event proposal generation model that produces confidence scores for multiple event proposals at multiple time scales at each time step of the video.…”
Section: Related Work
confidence: 99%
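The statement above can be made concrete with a small sketch: at each time step, proposals of several temporal scales ending at that step are assigned confidence scores in a single pass over the video. The scoring function here is a stand-in (a mean over features), not the cited models' networks; all names are assumptions for illustration.

```python
# Sketch of multi-scale event proposal scoring (hypothetical stand-in for
# the end-to-end models cited above). At each time step t, one proposal
# per scale ends at t and receives a confidence score.
def score_proposals(features, scales=(2, 4, 8), score_fn=None):
    """Return (start, end, confidence) for each time step and scale."""
    if score_fn is None:
        # Stand-in scorer: mean feature activation over the spanned frames.
        score_fn = lambda span: sum(span) / len(span)
    proposals = []
    for t in range(len(features)):
        for s in scales:
            start = max(0, t + 1 - s)  # clamp spans at the video start
            proposals.append((start, t + 1, score_fn(features[start:t + 1])))
    return proposals

feats = [0.1, 0.9, 0.8, 0.2]  # toy 1-D per-frame features
props = score_proposals(feats, scales=(2, 3))
print(len(props))  # 4 time steps × 2 scales = 8 proposals
```

This single forward sweep is what avoids the redundant recomputation of exhaustive sliding windows: each frame's features are reused by every proposal that spans it.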
“…In the extensive field of video understanding, the problem of Temporal Action Detection (TAD) in untrimmed videos has gained massive interest in recent years, as evidenced by the large body of work that has been proposed (e.g., [1][2][3][4][5][6][7][8][9]). For a recent survey on the topic, we refer the reader to [10].…”
Section: Introduction
confidence: 99%