2016
DOI: 10.1098/rspa.2015.0551
Maximum margin classifier working in a set of strings

Abstract: Numbers and numerical vectors account for a large portion of data. However, the amount of string data generated has recently increased dramatically. Consequently, classifying string data is a common problem in many fields. The most widely used approach to this problem is to convert strings into numerical vectors using string kernels and subsequently apply a support vector machine that works in a numerical vector space. However, this non-one-to-one conversion involves a loss of information and makes it impossib…
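The conventional approach that the abstract contrasts against can be sketched concretely. Below is a minimal illustration of one common string kernel, the k-spectrum kernel (chosen here as an assumption; the abstract does not name a specific kernel): each string is mapped to its multiset of length-k substrings, and the kernel is the inner product of the resulting count vectors. The sketch also shows why this feature map is not one-to-one, which is the information loss the paper's approach avoids.

```python
from collections import Counter

def kmer_counts(s: str, k: int) -> Counter:
    """Multiset of the length-k substrings of s (its k-spectrum)."""
    return Counter(s[i:i + k] for i in range(len(s) - k + 1))

def spectrum_kernel(a: str, b: str, k: int = 2) -> int:
    """Inner product of the k-mer count vectors of strings a and b."""
    ca, cb = kmer_counts(a, k), kmer_counts(b, k)
    return sum(n * cb[m] for m, n in ca.items())

# The feature map is not one-to-one: distinct strings can share the same
# spectrum, so the kernel (and any SVM built on it) cannot tell them apart.
print(kmer_counts("aabab", 2) == kmer_counts("abaab", 2))  # → True
print(spectrum_kernel("aabab", "abaab"))  # aa:1*1 + ab:2*2 + ba:1*1 → 6
```

A Gram matrix of such kernel values is what a standard support vector machine would consume; the cited paper instead works directly in the set of strings, sidestepping this lossy vectorization.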

Cited by 4 publications (8 citation statements)
References 45 publications
“…In this study, using the probability theory on A * developed in [19][20][21], we constructed the theory of the mixture model and the EM algorithm on A * and derived the optimal procedure for unsupervised string clustering based on the theory. We encountered the interesting phenomenon that an EM algorithm for the Laplace-like mixture on A * , which we sought, could not be written in an explicit manner because of the complex metric structure of A * , but a sequence of algorithms (i.e., a sequence of sequences of computations) that strongly consistently estimates the parameters of the Laplace-like mixture and converges to the EM algorithm with probability one was obtained explicitly.…”
Section: Discussion (citation type: mentioning; confidence: 99%)
“…This is different from the phenomena that an algorithm halts and that a sequence of approximate solutions from an iterative algorithm converges. The authors addressed the problem of supervised string classification by constructing a theory of a statistical learning machine that works in A * in [19]. Recently, the amount of string data has increased exponentially.…”
Section: Discussion (citation type: mentioning; confidence: 99%)