2007
DOI: 10.21236/ada479429

Cautious Inference in Collective Classification

Abstract: Collective classification can significantly improve accuracy by exploiting relationships among instances. Although several collective inference procedures have been reported, they have not been thoroughly evaluated for their commonalities and differences. We introduce novel generalizations of three existing algorithms that allow such algorithmic and empirical comparisons. Our generalizations permit us to examine how cautiously or aggressively each algorithm exploits intermediate relational data, which can be n…

Cited by 88 publications (147 citation statements) | References 10 publications
“…There is some evidence to suggest that ICA is fairly robust to a number of simple ordering strategies such as random ordering, visiting nodes in ascending order of the diversity of their neighborhood class labels, and labeling nodes in descending order [18].

Reference — local classifier used:
  Neville & Jensen [44]: naïve Bayes
  Lu & Getoor [35]: logistic regression
  Jensen, Neville, & Gallagher [25]: naïve Bayes, decision trees
  Macskassy & Provost [36]: naïve Bayes, logistic regression, weighted-vote relational neighbor, class distribution relational neighbor
  McDowell, Gupta, & Aha [39]: naïve Bayes, k-nearest neighbors of label confidences

However, there is also some evidence that certain modifications to the basic ICA procedure tend to produce improved classification accuracies.…”
Section: Algorithm 2: Gibbs Sampling Algorithm (GS)
confidence: 99%
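The ICA procedure discussed in the excerpt above can be sketched as follows. This is a minimal illustration, not any cited system's implementation: `majority` is a hypothetical stand-in for the learned local classifiers (naïve Bayes, logistic regression, etc.) listed in the table, and the `order` argument corresponds to the ordering strategies the excerpt mentions.

```python
from collections import Counter

def ica(graph, known, classify, order, n_iters=5):
    """Iterative Classification Algorithm (sketch).

    graph: dict mapping node -> list of neighbor nodes
    known: dict mapping node -> fixed label for labeled nodes
    classify: local classifier f(node, neighbor_labels) -> label
    order: sequence of unknown nodes (the visiting order)
    """
    labels = dict(known)
    # Bootstrap pass: classify each unknown node from the
    # neighbors that already carry labels.
    for v in order:
        labels[v] = classify(v, [labels[u] for u in graph[v] if u in labels])
    # Iterative passes: re-classify each unknown node using the
    # current (possibly intermediate) labels of all its neighbors.
    for _ in range(n_iters):
        for v in order:
            labels[v] = classify(v, [labels[u] for u in graph[v]])
    return labels

def majority(_node, nbr_labels):
    # Hypothetical stand-in for a learned local classifier:
    # predict the most frequent neighbor label.
    return Counter(nbr_labels).most_common(1)[0][0] if nbr_labels else "?"
```

A simple run on a 4-node chain with both endpoints labeled propagates the endpoint label inward over the iterations.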
“…Table 2: A list of systems and the aggregation operators they use to aggregate neighborhood class labels:

  PRMs, Friedman et al [12]: mode
  [35]: mode, count, exists
  Macskassy & Provost [36]: prop
  Gupta, Diwan, & Sarawagi [21]: mode, count
  McDowell, Gupta, & Aha [39]: prop

The systems include probabilistic relational models (PRMs), relational Markov networks (RMNs) and Markov logic networks (MLNs).…”
Section: Approximate Inference Algorithms for Approaches Based on Glo…
confidence: 99%
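The aggregation operators named in the table above (mode, count, exists, prop) turn a variable-size multiset of neighbor class labels into a fixed-size feature for the local classifier. A minimal sketch, with hypothetical function names, assuming a fixed known class list:

```python
from collections import Counter

def mode_agg(nbr_labels):
    # mode: the most frequent neighbor class label
    return Counter(nbr_labels).most_common(1)[0][0]

def count_agg(nbr_labels, classes):
    # count: per-class counts of neighbor labels
    return [nbr_labels.count(c) for c in classes]

def exists_agg(nbr_labels, classes):
    # exists: binary indicator of whether each class appears
    return [int(c in nbr_labels) for c in classes]

def prop_agg(nbr_labels, classes):
    # prop: per-class proportions of neighbor labels
    n = len(nbr_labels)
    return [nbr_labels.count(c) / n for c in classes]
```

For a node with neighbor labels ["A", "A", "B"], mode yields "A", count yields [2, 1], exists yields [1, 1], and prop yields [2/3, 1/3].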
“…Operators and examples:

  Relational Aggregators [30]: Mean, Median, Mode, Count
  Set Operators [28]: Union, Intersection, Multiset
  Subgraph Pattern [34]: k-star, k-clique, k-motif
  Dimensionality Reduction [40]: SVD, NMF, PCA
  Similarity [6]: Cosine Similarity, Mutual Information
  Paths/walks [25]: random-walks, k-walks
  Text Analysis [10] [36]: Latent Dirichlet Allocation (LDA), probabilistic latent semantic analysis…”
Section: Operators and Examples
confidence: 99%
“…Nodes of unknown label are then given the label of greatest probability. Unlike RL methods, Iterative Classification (IC) approaches [21,24,25] assign, at every iteration, a label to each node of unknown label, using a given relational classifier. To facilitate convergence, the number of nodes classified at each iteration can be gradually increased during the process [24,25].…”
Section: Related Work
confidence: 99%
“…Unlike RL methods, Iterative Classification (IC) approaches [21,24,25] assign, at every iteration, a label to each node of unknown label, using a given relational classifier. To facilitate convergence, the number of nodes classified at each iteration can be gradually increased during the process [24,25]. Although our classification approach could also be used within an IC framework, we have found the updated label probabilities of RL to have better convergence properties.…”
Section: Related Work
confidence: 99%
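The idea in the excerpts above of gradually increasing the number of classified nodes per iteration is the "cautious" behavior the paper's title refers to: only confident intermediate labels are committed early. A minimal sketch under assumed names, where `predict_majority` is a hypothetical local classifier that reports the majority fraction as a crude confidence score:

```python
from collections import Counter

def predict_majority(_node, nbr_labels):
    # Hypothetical stand-in local classifier: majority vote, with
    # the majority fraction used as a confidence estimate.
    if not nbr_labels:
        return "?", 0.0
    label, count = Counter(nbr_labels).most_common(1)[0]
    return label, count / len(nbr_labels)

def cautious_ica(graph, known, predict, n_iters=3):
    """Cautious iterative classification (sketch): at iteration i,
    commit only the i/n_iters most confident predictions, so the set
    of classified nodes grows gradually; by the final iteration every
    unknown node carries a label."""
    unknown = sorted(v for v in graph if v not in known)
    labels = dict(known)
    for i in range(1, n_iters + 1):
        # Predict from currently committed labels only.
        preds = {v: predict(v, [labels[u] for u in graph[v] if u in labels])
                 for v in unknown}
        # Commit a growing fraction of the most confident predictions.
        n_commit = (len(unknown) * i) // n_iters
        ranked = sorted(unknown, key=lambda v: -preds[v][1])
        labels = dict(known)
        for v in ranked[:n_commit]:
            labels[v] = preds[v][0]
    return labels
```

On a 5-node chain with both endpoints labeled, the committed set grows from the nodes adjacent to known labels inward, until all unknown nodes are labeled.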