Proceedings of the 2000 ACM SIGMOD International Conference on Management of Data 2000
DOI: 10.1145/342009.335388
LOF: Identifying Density-Based Local Outliers

Cited by 2,842 publications (333 citation statements); references 8 publications.
“…Nonparametric methods include, for instance, the well-known "distance-based" methods [Knorr and Ng 1997b; Ramaswamy et al. 2000; Angiulli and Pizzuti 2002] and also "density-based" methods such as LOF (local outlier factor) [Breunig et al. 2000] and its many variants. An important categorization of these approaches distinguishes "local" versus "global" approaches.…”
Section: Global and Local Outlier Detection
confidence: 99%
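The density-based LOF score cited above can be sketched in a few lines of plain Python. This is a didactic illustration of the k-distance, reachability-distance, and local-reachability-density chain from the LOF definition, not the authors' implementation; the toy dataset is invented for the example.

```python
import math

def lof_scores(points, k=2):
    """Compute Local Outlier Factor scores for a small dataset.

    LOF(p) is the mean ratio of the local reachability density (lrd)
    of p's k nearest neighbours to the lrd of p itself.  Scores near 1
    mean p sits in a region of density comparable to its neighbours;
    scores well above 1 flag p as a local outlier.
    """
    n = len(points)
    dist = [[math.dist(points[i], points[j]) for j in range(n)]
            for i in range(n)]

    # k nearest neighbours of each point (excluding the point itself)
    knn = [sorted(range(n), key=lambda j: dist[i][j])[1:k + 1]
           for i in range(n)]
    # k-distance of each point: distance to its k-th nearest neighbour
    kdist = [dist[i][knn[i][-1]] for i in range(n)]

    # reachability distance of p w.r.t. o: max(k-distance(o), d(p, o))
    def reach(p, o):
        return max(kdist[o], dist[p][o])

    # local reachability density: inverse of mean reachability distance
    lrd = [k / sum(reach(i, o) for o in knn[i]) for i in range(n)]

    # LOF: mean lrd of the neighbours divided by the point's own lrd
    return [sum(lrd[o] for o in knn[i]) / (k * lrd[i]) for i in range(n)]

# Tight unit-square cluster plus one distant point; the last point
# receives the highest LOF score.
data = [(0, 0), (0, 1), (1, 0), (1, 1), (8, 8)]
scores = lof_scores(data, k=2)
```

The cluster points score close to 1 while the isolated point scores far above it, which is exactly the "local" behaviour the statement above contrasts with global approaches.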
“…We use the Local Outlier Factor (LOF) test [5]. It requires no knowledge of the underlying probability density function.…”
Section: Outlier Detection and Clustering
confidence: 99%
“…Therefore, the infrastructural features cannot be used, as they would be sufficient on their own to discriminate between the individual building classes within the training samples. For the outlier detection, the number of buildings forming a cluster has been set to ten, as suggested in [5]. All houses with an LOF three times greater than the mean outlier factor are deleted from the dataset.…”
Section: Identifying Typical Buildings
confidence: 99%
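The deletion rule described in the statement above (drop every record whose LOF exceeds three times the mean outlier factor) can be sketched as follows; the scores and record labels are hypothetical placeholders, not data from the cited study.

```python
# Hypothetical LOF scores and matching records; in the cited study these
# would come from an LOF run over the building dataset.
scores = [1.0, 1.1, 0.9, 1.2, 10.3]
records = ["h1", "h2", "h3", "h4", "h5"]

# Keep only records whose LOF does not exceed three times the mean score.
threshold = 3 * sum(scores) / len(scores)
kept = [r for r, s in zip(records, scores) if s <= threshold]
```

With these placeholder scores the mean is 2.9, so the threshold is 8.7 and only the last record is discarded.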
“…In the descriptor (input) space, one can use as AD a threshold on the distance between a test object and specially selected training set objects (as in the case of the k-center method [27]) or the closest training set objects, as realized in different versions of the kNN approach. [6,28,29] One can also define a kernel-representing (feature) space, onto which the input space is implicitly mapped. [11] In this case, the boundary between high- and low-density regions is associated with the support vectors.…”
Section: One-Class Classification and Probability Density Estimation
confidence: 99%
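The distance-to-training-set applicability-domain (AD) check described above can be sketched as follows. The training points, the neighbour count k, and the threshold are illustrative assumptions, not values from the cited work.

```python
import math

# Toy training set: four points forming a unit square.
train = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]

def in_domain(x, k=3, threshold=1.0):
    """Return True if x lies inside the applicability domain.

    The test object is considered inside the domain when the mean
    distance to its k nearest training objects stays below a fixed
    threshold (one common kNN-style AD criterion).
    """
    dists = sorted(math.dist(x, t) for t in train)
    return sum(dists[:k]) / k <= threshold

inside = in_domain((0.5, 0.5))   # centre of the training square
outside = in_domain((5.0, 5.0))  # far from all training objects
```

A point near the training data passes the check, while a distant test object fails it, mirroring the threshold-on-distance AD the statement describes.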