2004
DOI: 10.1590/s0104-65002004000100006
The Douglas-Peucker algorithm: sufficiency conditions for non-self-intersections

Abstract: The classic Douglas-Peucker line-simplification algorithm is recognized as the one that delivers the best perceptual representations of the original lines. It may, however, produce a simplified polyline that is not topologically equivalent to the original one consisting of all the vertex samples. On the basis of properties of the polyline hulls, Saalfeld devised a simple rule for detecting topological inconsistencies and proposed to solve them by carrying out additional refinements. In this paper, we present an alternat…
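The inconsistency the abstract refers to is a simplified polyline that crosses itself (or other retained geometry) even though the original sample did not. Neither the paper's sufficiency conditions nor Saalfeld's hull-based rule is reproduced here; purely as an illustration of what is being tested, a brute-force self-intersection check might look like the following sketch in Python (function names are my own):

```python
def _cross(o, a, b):
    """Sign of the cross product (a - o) x (b - o)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def _proper_crossing(p1, p2, q1, q2):
    """True if segment p1-p2 strictly crosses segment q1-q2.
    Collinear and endpoint-touching cases are ignored for brevity."""
    d1 = _cross(q1, q2, p1)
    d2 = _cross(q1, q2, p2)
    d3 = _cross(p1, p2, q1)
    d4 = _cross(p1, p2, q2)
    return d1 * d2 < 0 and d3 * d4 < 0

def polyline_self_intersects(points):
    """O(n^2) check: do any two non-adjacent segments of the polyline cross?"""
    segments = list(zip(points[:-1], points[1:]))
    for i in range(len(segments)):
        for j in range(i + 2, len(segments)):
            if _proper_crossing(*segments[i], *segments[j]):
                return True
    return False
```

A quadratic scan like this is only practical for short polylines; the point of hull-based rules and sufficiency conditions is to detect or rule out such crossings without testing every pair of segments.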

Cited by 26 publications (12 citation statements)
References 6 publications
“…A polyline is created using the input dataset coordinates as polyline vertexes, from which a tolerance distance or an idealized number of points can be predefined. The algorithm strategy recursively creates new segments approximating the original polyline, until all vertices of the polyline satisfy the predefined tolerance condition [39]. Both sampling design approaches were further evaluated to assess kriging interpolation accuracy metrics using an external validation subset as further detailed in the next section.…”
Section: EM38-MK2 Data Filtering and External Validation (mentioning)
confidence: 99%
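The recursive strategy quoted above is the classic Douglas-Peucker scheme: keep the two endpoints, find the interior vertex farthest from the chord joining them, and recurse on both halves while that distance exceeds the tolerance. A minimal sketch under that reading, using a perpendicular-distance tolerance in plain Python (helper names are mine, not from the cited works):

```python
import math

def _perpendicular_distance(p, a, b):
    """Distance from point p to the line through a and b
    (falls back to the endpoint distance when a == b)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    chord_len = math.hypot(dx, dy)
    if chord_len == 0.0:
        return math.hypot(px - ax, py - ay)
    return abs(dx * (ay - py) - dy * (ax - px)) / chord_len

def douglas_peucker(points, tolerance):
    """Recursively keep the vertex farthest from the chord while it exceeds
    the tolerance; otherwise collapse the run to its two endpoints."""
    if len(points) < 3:
        return list(points)
    a, b = points[0], points[-1]
    # Find the interior vertex farthest from the chord a-b.
    idx, dmax = 0, -1.0
    for i in range(1, len(points) - 1):
        d = _perpendicular_distance(points[i], a, b)
        if d > dmax:
            idx, dmax = i, d
    if dmax <= tolerance:
        return [a, b]
    # Split at the farthest vertex and simplify both halves.
    left = douglas_peucker(points[: idx + 1], tolerance)
    right = douglas_peucker(points[idx:], tolerance)
    return left[:-1] + right  # drop the duplicated split vertex
```

A larger tolerance discards more interior vertices; the topological problems discussed in the abstract arise because this distance test says nothing about whether the retained segments end up crossing other parts of the line.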
“…It reduces the number of curves in an image and transforms them into straight lines between two points. As a result, the contour of an object can be identified more easily [29]. If the number of contours is 4, then the object is classified as a rectangular object.…”
Section: Shape Segmentation (mentioning)
confidence: 99%
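One common way to realize the check quoted above is OpenCV's cv2.approxPolyDP, which applies a Douglas-Peucker-style reduction to a contour; the condition on "the number of contours" is interpreted here as the approximated contour having four vertices. A hedged sketch under those assumptions (the mask path and the epsilon fraction are placeholders, not values from [29]):

```python
import cv2

def classify_as_rectangle(contour, epsilon_fraction=0.02):
    """Approximate the contour with cv2.approxPolyDP (a Douglas-Peucker
    reduction) and call it a rectangle if four vertices remain."""
    perimeter = cv2.arcLength(contour, closed=True)
    approx = cv2.approxPolyDP(contour, epsilon_fraction * perimeter, closed=True)
    return len(approx) == 4

# Example usage (OpenCV >= 4): find contours in a binary mask and test each one.
mask = cv2.imread("mask.png", cv2.IMREAD_GRAYSCALE)  # placeholder path
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
rectangles = [c for c in contours if classify_as_rectangle(c)]
```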
“…After the preprocessing of the eye image, the visible eyeball area is considered as an ellipse. To identify the outermost border of this ellipse, the border-following algorithm by Suzuki and Abe [45] is used, and then the Douglas-Peucker approximation algorithm [46] is utilized to reduce the number of points in the curve. Then, the center (x̄, ȳ) of the ellipse is estimated using moments [47] and is used to indicate the center of the eye.…”
Section: Estimation of Gaze Direction (mentioning)
confidence: 99%
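A rough sketch of the quoted pipeline, assuming an OpenCV implementation: cv2.findContours provides the border following of Suzuki and Abe [45], cv2.approxPolyDP provides the Douglas-Peucker reduction [46], and cv2.moments yields the centroid (x̄, ȳ) [47]. Taking the largest contour as the eyeball border and the 1% epsilon fraction are my assumptions, not details from the cited work:

```python
import cv2

def estimate_eye_center(binary_eye_mask):
    """Return the centroid of the simplified outer eyeball border,
    or None if no usable contour is found."""
    # cv2.findContours implements the Suzuki-Abe border following.
    contours, _ = cv2.findContours(binary_eye_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    outer = max(contours, key=cv2.contourArea)  # assumed: largest = eyeball border
    # Douglas-Peucker reduction of the border points.
    epsilon = 0.01 * cv2.arcLength(outer, closed=True)  # placeholder fraction
    approx = cv2.approxPolyDP(outer, epsilon, closed=True)
    # Centroid from image moments: (m10/m00, m01/m00).
    m = cv2.moments(approx)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # (x̄, ȳ)
```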