1995 International Conference on Acoustics, Speech, and Signal Processing
DOI: 10.1109/icassp.1995.480071
Graph-theoretical analysis of the fractal transform

Cited by 15 publications
(15 citation statements)
References 4 publications
“…Domaszewicz and Vaishampayan (1995) introduced the notion of a dependence graph. The graph reflects how domain blocks are assigned to range blocks.…”
Section: Existing Fractal Recognition Methods With Division
confidence: 99%
“…Moreover, we introduce a new fractal recognition method which will be used in the recognition of 2D shapes. As fractal features we use a dependence graph (Domaszewicz and Vaishampayan, 1995) obtained from a Partitioned Iterated Function System (PIFS) (Fisher, 1995).…”
Section: Introduction
confidence: 99%
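As a rough illustration of the dependence-graph idea referenced in the statements above, the sketch below assembles such a graph from the range-to-domain assignments of a PIFS. The `fractal_code` representation and the block identifiers are assumptions made for this example, not a construction taken from the cited papers.

```python
# Illustrative sketch only: the PIFS is assumed to be given as a list of
# (range_block_id, domain_block_id) pairs; identifiers and structure are
# invented for the example, not taken from the cited papers.
from collections import defaultdict

def build_dependence_graph(fractal_code):
    """Return the dependence graph of a fractal code as an adjacency dict.

    An edge domain -> range records that the range block is approximated
    from (and therefore depends on) that domain block during decoding.
    """
    graph = defaultdict(list)
    for range_id, domain_id in fractal_code:
        graph[domain_id].append(range_id)
    return dict(graph)

# Three range blocks mapped onto two domain blocks.
code = [("R0", "D1"), ("R1", "D1"), ("R2", "D0")]
print(build_dependence_graph(code))  # {'D1': ['R0', 'R1'], 'D0': ['R2']}
```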
“…Of these measures, the most important one exploits the dependence graph [18] of the fractal code for a fast computation of the cost function.…”
Section: Introduction
confidence: 99%
“…The adaptive partition [68] and the hybrid compression algorithm achieve relatively high compression ratios for images [68] and video-conference sequences [67]. In conclusion, a fractal image codec performs well in terms of a very fast decoding process as well as the promise of potentially good compression [69][70][71][72][73]. At present, however, the fractal codec is not standardized because of its huge computational load and slow coding speed.…”
Section: Related Work On Motion Estimation
confidence: 99%
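The speed asymmetry noted in this statement follows from the structure of the transform: decoding only iterates the stored contractive block maps from an arbitrary start image, whereas encoding must search candidate domain blocks for every range block. The simplified 1-D decoder below is a hypothetical sketch with assumed block sizes and code format, not the codec of the works cited above.

```python
# Hypothetical, simplified 1-D decoder: block sizes and the code format are
# assumptions for illustration, not the scheme of the cited references.
import numpy as np

def decode(fractal_code, length, range_size=4, iterations=10):
    """fractal_code: list of (range_start, domain_start, scale, offset).

    Each range block of `range_size` samples is rebuilt from a domain block
    of 2 * range_size samples: the domain is contracted spatially by pairwise
    averaging, then mapped affinely in grey level. With |scale| < 1 the map
    is contractive, so iterating it from any start converges to the attractor.
    """
    x = np.zeros(length)                                 # arbitrary initial signal
    for _ in range(iterations):
        y = np.zeros_like(x)
        for r, d, s, o in fractal_code:
            domain = x[d:d + 2 * range_size]
            shrunk = domain.reshape(-1, 2).mean(axis=1)  # spatial contraction
            y[r:r + range_size] = s * shrunk + o         # grey-level map
        x = y
    return x

# Two range blocks covering a length-8 signal, both read from domain x[0:8].
print(decode([(0, 0, 0.5, 10.0), (4, 0, 0.5, 40.0)], length=8))
```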