2015
DOI: 10.1007/s11859-015-1094-9
An approximate linear solver in least square support vector machine using randomized singular value decomposition

Abstract: In this paper, we investigate the linear solver in the least square support vector machine (LSSVM) for large-scale data regression. Traditional methods that use direct solvers are costly. Because the linear equations must be solved repeatedly when choosing appropriate parameters in LSSVM, the key to speeding up LSSVM is to improve the method for solving these linear equations. We approximate large-scale kernel matrices and obtain an approximate solution of the linear equations by using randomized singular value decomposition. …
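The truncated abstract sketches the idea: replace the dense kernel matrix in the LSSVM linear system with a low-rank factorization obtained by randomized SVD, so that the system can be solved approximately and cheaply while tuning parameters. The following Python sketch illustrates this general approach under stated assumptions; the RBF kernel, the rank and oversampling parameters, and all function names are illustrative choices and not the paper's exact formulation.

```python
# Hedged sketch: approximate LSSVM regression solve using a randomized SVD
# of the kernel matrix. All names (rbf_kernel, gamma, sigma, k, p) are
# illustrative assumptions, not notation taken from the paper.
import numpy as np

def rbf_kernel(X, sigma=1.0):
    """Dense RBF kernel matrix: Omega_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def randomized_svd_sym(A, k, p=10, seed=0):
    """Rank-k approximation A ~ U diag(s) U.T for a symmetric PSD matrix,
    using a Gaussian test matrix and a QR-based randomized range finder."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    G = rng.standard_normal((n, k + p))
    Q, _ = np.linalg.qr(A @ G)          # orthonormal basis for the range of A
    B = Q.T @ A @ Q                     # small (k+p) x (k+p) projected matrix
    s, V = np.linalg.eigh(B)            # eigen-decomposition of the projection
    idx = np.argsort(s)[::-1][:k]       # keep the k largest eigenpairs
    return Q @ V[:, idx], s[idx]

def lssvm_solve_approx(X, y, gamma=10.0, sigma=1.0, k=50):
    """Approximately solve the LSSVM regression system
        [[0, 1^T], [1, Omega + I/gamma]] [b; alpha] = [0; y]
    with Omega replaced by its rank-k randomized SVD approximation."""
    n = X.shape[0]
    K = rbf_kernel(X, sigma)
    U, s = randomized_svd_sym(K, k)
    H = U @ np.diag(s) @ U.T + np.eye(n) / gamma   # low-rank kernel + ridge term
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = H
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)       # for clarity only; see note below
    return sol[0], sol[1:]              # bias b, coefficient vector alpha

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-3, 3, size=(500, 1))
    y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(500)
    b, alpha = lssvm_solve_approx(X, y)
    print("bias:", b, "||alpha||:", np.linalg.norm(alpha))
```

For clarity the sketch assembles the full (n+1)-by-(n+1) system and calls a dense solver; a practical implementation would instead exploit the low-rank structure (for example via the Woodbury identity) and reuse the factorization across parameter choices, which is where the speed-up over a direct solver would come from.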

Citations: Cited by 4 publications (4 citation statements)
References: 16 publications
“…4). The reason may be that the hydroxyl groups of starch created hydrogen bonds and improved the hydrophilic ability of the polymer [13][14]. When there was less starch, the macromolecular network skeleton could not form well and the hygroscopicity of the polymer was poor; when there was more starch, hydrogen bonds could form between the starch molecules.…”
Section: Results and Analysis
confidence: 99%
“…The disadvantage of this method is that the model training is performed twice, so the solution process is complicated and time-consuming. Subsequently, an LSSVM algorithm with a fixed-size sample set and a corresponding improved algorithm [3,4], as well as methods combining LSSVM with other machine learning algorithms [10,13], have appeared. The core concept of these algorithms is to compress large datasets into smaller sub-datasets and then train them in the LSSVM model [15,16].…”
Section: Introduction
confidence: 99%
“…Suykens et al. have made substantial efforts using large-scale data sets; these authors have developed many excellent algorithms [3–8] and combined the LSSVM with other algorithms for machine learning [9–11].…”
Section: Introduction
confidence: 99%
“…Over the last decade, in order to obtain sparse solutions, many scholars have conducted related research and proposed new and improved algorithms to address the sparsity of the training sample set. Suykens et al. have made substantial efforts using large-scale data sets; these authors have developed many excellent algorithms [3–8] and combined the LSSVM with other algorithms for machine learning [9–11].…”
confidence: 99%