2016
DOI: 10.4103/2228-7477.186887
Extraction of the best frames in coronary angiograms for diagnosis and analysis

Abstract: X-ray coronary angiography has been a gold standard in the clinical diagnosis and interventional treatment of coronary arterial diseases for decades. In angiography, a sequence of images is obtained, a few of which are suitable for physician inspection. This paper proposes an automatic algorithm for the extraction of one or more frames from an angiogram sequence, which is most suitable for diagnosis and analysis by experts or processors. The algorithm consists of two stages: In the first stage, the background …

Cited by 6 publications (4 citation statements) | References 12 publications
“…Currently, the selection of the end-diastolic frame is performed either manually or automatically based on the simultaneously acquired ECG signal [5]. The ECG signal may not always be available, and ECG-based cardiac phase detection has several drawbacks: the signal-to-noise ratio may be too low to accurately detect end-diastole, or the signal may present artefacts [11], [12]. Herein we report, to the best of our knowledge, the first deep learning based workflow for purely image-based cardiac phase classification of angiographic frames validated on a large, real-world dataset.…”
Section: Discussion
confidence: 99%
“…Currently, the selection of the EDF and the identification of the cardiac phase are performed either manually or automatically based on simultaneously acquired ECG signals [5]. This has a number of drawbacks: ECG signals may not always be available, and cardiac phase detection based on ECG can be challenging if the signal-to-noise ratio is too low to accurately detect end-diastole or the signal presents artefacts [11], [12]. Methods for automated cardiac phase detection on medical images, without the need for processing ECG signals, have been described in the past for use cases involving cardiac echocardiographic images [13]–[17], cardiac angiographic images [18], cardiac MRI [19], [20], and intravascular ultrasound [21].…”
Section: Introduction
confidence: 99%
“…The original article entitled “Extraction of the Best Frames in Coronary Angiograms for Diagnosis and Analysis”, published in the Journal of Medical Signals and Sensors, pages 150-157, Issue 3, Volume 6, 2016,[ 1 ] contains a number of unattributed sections with a high rate of similarity to an article titled “Local feature fitting active contour for segmenting vessels in angiograms”, published in IET Computer Vision, pages 161-170, Issue 3, Volume 8, 2014.[ 2 ] Plagiarism, unethical publication, or redundant publication violates the editorial policy of the Journal of Medical Signals and Sensors, which follows the best-practice guidelines of the International Committee of Medical Journal Editors (ICMJE) and the Committee on Publication Ethics (COPE), as stated in the Information for Authors and as codified in the signed statements made by the authors regarding the copyright of their work.…”
confidence: 99%