ELM∗: distributed extreme learning machine with MapReduce
2013
DOI: 10.1007/s11280-013-0236-2

Cited by 53 publications (36 citation statements)
References 29 publications
“…Thus, we can use the MapReduce framework to speed up the computation of the output weight vector β. The process of calculating the matrices U and V with the MapReduce framework is shown in Algorithm 1 [28]. In the Initialize method of the Mapper, we initialize two arrays, u and v, which store the intermediate sums of the elements of matrices U and V, respectively.…”
Section: ELM
confidence: 99%
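The excerpt above can be illustrated with a minimal sketch (not the authors' code): in ELM, the output weights solve Uβ = V with U = HᵀH and V = HᵀT, where H is the hidden-layer output matrix and T the targets. Because both are sums over training rows, each mapper can accumulate partial u, v arrays over its shard and a reducer adds them. The shard layout, variable names, and toy sizes below are illustrative assumptions.

```python
import numpy as np

def mapper(H_shard, T_shard):
    # Initialize-style step: u and v hold the intermediate sums of the
    # elements of U and V contributed by this mapper's data shard.
    u = H_shard.T @ H_shard   # partial U = H^T H over the shard
    v = H_shard.T @ T_shard   # partial V = H^T T over the shard
    return u, v

def reducer(partials):
    # Element-wise sum of all mappers' partial u, v arrays.
    u_total = sum(p[0] for p in partials)
    v_total = sum(p[1] for p in partials)
    return u_total, v_total

# Toy data standing in for hidden-layer outputs H and targets T.
rng = np.random.default_rng(0)
H = rng.standard_normal((100, 5))
T = rng.standard_normal((100, 2))
shards = [(H[i:i + 25], T[i:i + 25]) for i in range(0, 100, 25)]

U, V = reducer([mapper(h, t) for h, t in shards])
beta = np.linalg.solve(U, V)  # output weights from U beta = V
```

Since matrix products decompose into per-row outer-product sums, the sharded result is identical to the single-machine computation, which is what makes the MapReduce formulation exact rather than approximate.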
“…As variants of ELM, the distributed ELMs (i.e., PELM [27] and ELM∗ [28]) based on MapReduce [29–31] can address the V1 (Volume) problem of Big Data. However, in big data classification it is quite common that new training data arrive, old training data expire, and erroneous training data are corrected.…”
Section: Introduction
confidence: 99%