2017
DOI: 10.1007/978-3-319-57454-7_7

A Deep Neural Network for Pairwise Classification: Enabling Feature Conjunctions and Ensuring Symmetry

Abstract: Pairwise classification is a computational problem to determine whether a given ordered pair of objects satisfies a binary relation R which is specified implicitly by a set of training data used for 'learning' R. It is an important component of entity resolution, network link prediction, protein-protein interaction prediction, and so on. Although deep neural networks (DNNs) outperform other methods in many tasks and have thus attracted the attention of machine learning researchers, there have been f…
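A minimal sketch of the general idea in the abstract, not the paper's actual architecture: a pairwise classifier whose score is invariant to swapping the two objects, with feature conjunctions modeled by an element-wise product of shared encodings. The class name, layer sizes, and the specific choice of product plus absolute difference are illustrative assumptions.

import torch
import torch.nn as nn

class SymmetricPairNet(nn.Module):
    """Toy symmetric pairwise classifier (illustrative, not the paper's model)."""

    def __init__(self, in_dim: int, hidden: int = 64):
        super().__init__()
        # A single shared encoder applied to both objects (weight sharing).
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        # Classifier over symmetric feature conjunctions of the two encodings.
        self.classifier = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        ua, ub = self.encoder(a), self.encoder(b)
        # Element-wise product captures feature conjunctions; |ua - ub| is also
        # order-invariant, so score(a, b) == score(b, a) by construction.
        pair = torch.cat([ua * ub, (ua - ub).abs()], dim=-1)
        return self.classifier(pair)  # logit for "R(a, b) holds"

# Quick check of the symmetry property on random inputs.
net = SymmetricPairNet(in_dim=16)
x, y = torch.randn(4, 16), torch.randn(4, 16)
assert torch.allclose(net(x, y), net(y, x), atol=1e-6)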

Cited by 11 publications (12 citation statements)
References 14 publications (17 reference statements)
“…It is to find a general linear subspace in which the cumulative pairwise canonical correlation between every pair of feature sets is maximized after the dimension normalization and subspace projection. They used multiple feature fusion with subspace learning for face recognition. Atarashi et al. [42] used conjunctions of features for pairwise classifiers across instances. They then applied the method using a support vector machine and a simple DNN.…”
Section: B. Fusion of Block Features
confidence: 99%
“…Fu et al. [41]: face recognition, multiple feature fusion. Atarashi et al. [42]: entity resolution and symmetry; linear, multiplicative, and distance combinations. Lu et al.…”
Section: Volume 4, 2016
confidence: 99%
“…While the DNN can extract useful feature representations, introducing a layer that uses feature combinations explicitly improves the performance of the DNN. Indeed, the DNN-based models using feature combinations outperformed simple DNNs in some applications [23], [24], [26], [27]. We also propose a higher-order pairwise deep neural network (HOPairDNN) by defining $z_s^{(m)} = \sum_{t=2}^{m} P_t\big(u_s^{(t)}(a),\, u_s^{(t)}(b)\big)$ for $s \in [k]$ in Eq.…”
Section: Higher-order Pairwise Deep Neural Network
confidence: 99%
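One possible reading of the quoted definition, sketched under explicit assumptions: $u^{(t)}$ is taken as the layer-$t$ hidden representation of a shared encoder, and $P_t$ as an element-wise product of the two objects' layer-$t$ encodings, so $z^{(m)}$ sums symmetric combinations over layers $t = 2, \dots, m$ (element-wise operations act per coordinate $s$). The class and parameter names are hypothetical, and the quoted paper's actual $P_t$ may differ.

import torch
import torch.nn as nn

class HigherOrderPairCombiner(nn.Module):
    """Hypothetical reading of the quoted z^(m) definition, not the cited model."""

    def __init__(self, in_dim: int, hidden: int, num_layers: int):
        super().__init__()
        assert num_layers >= 2, "need at least layers t = 1..2 for the sum over t >= 2"
        dims = [in_dim] + [hidden] * num_layers
        # One shared stack of layers; u^(t) is the activation after layer t.
        self.layers = nn.ModuleList(
            [nn.Linear(dims[t], dims[t + 1]) for t in range(num_layers)]
        )

    def encode(self, x: torch.Tensor):
        us = []
        for layer in self.layers:
            x = torch.relu(layer(x))
            us.append(x)
        return us  # [u^(1), ..., u^(m)]

    def forward(self, a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        ua, ub = self.encode(a), self.encode(b)
        # z^(m) = sum_{t=2}^{m} P_t(u^(t)(a), u^(t)(b)), with P_t taken here as an
        # element-wise product, which keeps the result symmetric in (a, b).
        return sum(ua[t] * ub[t] for t in range(1, len(self.layers)))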
“…• DBLP. For the author disambiguation task, we extracted 3,384 papers in which there were 729 unique author names from the DBLP dataset [23]. Each paper was considered an object.…”
Section: Datasets
confidence: 99%