2019
DOI: 10.1109/tpds.2018.2877359
A Bi-layered Parallel Training Architecture for Large-Scale Convolutional Neural Networks

Abstract: Benefiting from large-scale training datasets and complex training networks, Convolutional Neural Networks (CNNs) are widely applied in various fields with high accuracy. However, the training process of CNNs is very time-consuming: large numbers of training samples and iterative operations are required to obtain high-quality weight parameters. In this paper, we focus on the time-consuming training process of large-scale CNNs and propose a Bi-layered Parallel Training (BPT-CNN) architecture in distri…
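The abstract describes a bi-layered parallel scheme whose outer layer distributes CNN training across nodes of a cluster. A minimal sketch of what such an outer layer typically amounts to is synchronous data-parallel training: each worker computes a gradient on its shard of the data, the gradients are averaged, and the shared weights are updated. All names below are hypothetical illustrations, not the paper's implementation; a toy 1-D least-squares model stands in for the CNN.

```python
# Illustrative sketch of outer-layer data parallelism (synchronous SGD-style):
# per-worker gradients on data shards, averaged, then applied to shared weights.
# Hypothetical names; a toy linear model y = w*x stands in for a real CNN.

def shard(data, num_workers):
    """Split the dataset into roughly equal shards, one per worker."""
    k, r = divmod(len(data), num_workers)
    shards, start = [], 0
    for i in range(num_workers):
        end = start + k + (1 if i < r else 0)
        shards.append(data[start:end])
        start = end
    return shards

def local_gradient(weights, samples):
    """Toy gradient of mean squared error for the 1-D model y = w*x."""
    w = weights[0]
    g = sum(2 * (w * x - y) * x for x, y in samples) / max(len(samples), 1)
    return [g]

def parallel_step(weights, data, num_workers, lr=0.01):
    """One synchronous step: per-worker gradients, averaged, then applied."""
    grads = [local_gradient(weights, s) for s in shard(data, num_workers)]
    avg = [sum(g[i] for g in grads) / num_workers for i in range(len(weights))]
    return [w - lr * g for w, g in zip(weights, avg)]

# Training data generated from ground-truth w = 3.
data = [(x, 3.0 * x) for x in range(1, 9)]
w = [0.0]
for _ in range(200):
    w = parallel_step(w, data, num_workers=4)
```

With equal-size shards the averaged shard gradients equal the full-batch gradient, so the loop converges to w close to 3; in a real distributed setting the averaging step would be a network all-reduce rather than a local loop.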

Cited by 152 publications (65 citation statements)
References 20 publications (28 reference statements)
“…Various distributed AI and DL algorithms were proposed in distributed computing, cloud computing, fog computing, and edge computing environments to improve their performance and scalability [15][16][17][18][19][20]. In our previous work, we proposed a two-layer parallel CNN training architecture in a distributed computing cluster [15]. Li et al discussed the application of Machine Learning (ML) in smart industry and introduced an efficient manufacture inspection system using fog computing [18].…”
Section: Related Work
confidence: 99%
“…Focusing on distributed VS systems and AI algorithms, most current VS systems rely on traditional centralized or cloud-based solutions, facing huge data communication overhead, high latency, and severe packet loss limitations [3,12]. Existing studies have proposed various distributed AI and Deep Learning (DL) algorithms in distributed computing clusters and cloud computing platforms, such as distributed CNN, DNN, and LSTM [9,15,16]. There are many exploration spaces for distributed AI algorithms and VS system in EC environments [4,9,17].…”
Section: Introduction
confidence: 99%
“…While the analysis performed provided valuable insight, the proposed algorithm required a lot of memory to store records. There has been a lot of research works on deep learning, for example, a Bi-layered Parallel Training Convolutional Neural Networks (BPT-CNN) has been proposed in a distributed computing environment [36], [37]. The work in [38] proposed a deep learning scheme for intelligent video surveillance systems with edge computing.…”
Section: B. Learning-Based Physical Layer Authentication
confidence: 99%
“…Thus, in our article, we aim to incorporate the attention mechanism into the bi-directional recurrent Bayesian topic model for documents. For training models on large-scale data, various methods such as [13,14,30,39,48,60,76] are investigated. We also propose an online learning algorithm to handle large-scale document streams.…”
Section: Related Work
confidence: 99%