2013
DOI: 10.1007/978-3-642-42042-9_44
Using Similarity between Paired Instances to Improve Multiple-Instance Learning via Embedded Instance Selection

Cited by 2 publications (4 citation statements)
References 8 publications
“…Several regular machine learning algorithms were adapted to process raw multi-instance data: maximum likelihood-based methods, [56-59] decision rules and tree-based methods, [60-63] SVM-based methods, [48] and evolutionary-based methods. [64]…”
Section: Multi-instance Learning Algorithms
Confidence: 99%
“…If any instance of a given bag is closer to the prototype instance than a threshold, the bag is classified as positive. Expectation-Maximization Diverse Density (EM-DD) [57] uses the EM algorithm to locate the prototype instances more efficiently. There exist several other MI algorithms based on the Diverse Density approach, such as DD-SVM [65] and MILES. [66]…”
Section: Multi-instance Learning Algorithms
Confidence: 99%
“…Diverse Density (DD) seeks to find the concept point t in the determined feature space (Zhang and Goldman 2001). The concept point t is required to be close to at least one instance from each positive bag, and meanwhile it should be far away from instances in negative bags.…”
Section: EM-DD
Confidence: 99%
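The requirement that t be near at least one instance per positive bag and far from all negative instances is usually expressed with a noisy-or objective. The sketch below follows the standard Diverse Density formulation under a Gaussian-like instance model; it is an illustration under those assumptions, not the cited papers' code.

```python
import numpy as np

def diverse_density(t, positive_bags, negative_bags):
    """Noisy-or Diverse Density of a candidate concept point t.
    A point scores high when every positive bag has at least one
    instance near t and no negative instance is near t."""
    def p_instance(x):
        # Probability that instance x matches the concept at t
        # (Gaussian-like kernel on squared Euclidean distance).
        return np.exp(-np.sum((np.asarray(x) - t) ** 2))

    dd = 1.0
    for bag in positive_bags:
        # Noisy-or: at least one instance in the bag should match.
        dd *= 1.0 - np.prod([1.0 - p_instance(x) for x in bag])
    for bag in negative_bags:
        # No instance in a negative bag should match.
        dd *= np.prod([1.0 - p_instance(x) for x in bag])
    return dd

# A candidate near a positive instance and away from negatives scores higher.
pos = [[[0.1, 0.0], [3.0, 3.0]]]   # one positive bag, two instances
neg = [[[3.0, 3.0]]]               # one negative bag
print(diverse_density(np.array([0.0, 0.0]), pos, neg) >
      diverse_density(np.array([3.0, 3.0]), pos, neg))  # True
```

EM-DD replaces the full noisy-or over each bag with the single most likely instance per bag (the E-step), which is what makes its search for t more efficient.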