2019
DOI: 10.1109/access.2019.2918268

Automated Neural-Based Modeling of Microwave Devices Using Parallel Computation and Interpolation Approaches

Abstract: Automated model generation (AMG) is an automated artificial neural network (ANN) modeling algorithm, which integrates all the subtasks (including adaptive sampling/data generation, model structure adaptation, training, and testing) in neural model development into one unified framework. In existing AMG, most of the time is spent on data sampling and model structure adaptation due to the iterative neural network training and the sequential computation mechanism. In this paper, we propose an advanced AMG algorit…
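As a rough illustration of the loop the abstract describes, the sketch below alternates adaptive sampling, training, testing, and model-structure growth around a stand-in simulator. All names, thresholds, and the growth rule are assumptions made for illustration; the paper's actual AMG algorithm, sampling scheme, and stopping criteria are not reproduced here.

```python
# Minimal sketch of an AMG-style loop (illustrative assumptions throughout).
import numpy as np
from sklearn.neural_network import MLPRegressor

def em_simulator(x):
    # Stand-in for the expensive EM simulation the surrogate is meant to replace.
    return np.sin(3 * x[:, 0]) * np.cos(2 * x[:, 1])

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(20, 2))          # initial training samples
y = em_simulator(X)
X_test = rng.uniform(0, 1, size=(200, 2))    # fixed testing set
y_test = em_simulator(X_test)

hidden = 5                                    # starting model size
for it in range(10):
    model = MLPRegressor(hidden_layer_sizes=(hidden,), max_iter=5000,
                         random_state=0).fit(X, y)
    err = np.abs(model.predict(X_test) - y_test)
    if err.mean() < 0.02:                     # assumed accuracy target
        break
    # Adaptive sampling: add data near the worst-predicted test points.
    worst = X_test[np.argsort(err)[-5:]]
    X = np.vstack([X, worst])
    y = np.concatenate([y, em_simulator(worst)])
    hidden += 5                               # model structure adaptation
```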

Cited by 5 publications (4 citation statements)
References 27 publications (43 reference statements)
“…); 2) Which strategy to consider for model creation that exploits knowledge of passive device performances (e.g., which should be the inputs/outputs of the model). Regarding the first choice, there are several ML techniques proposed in the literature and applied to passive component modeling [8], [9], [10], [11], [12], [13], [14], [15], [16], [17], [18], [19], [20], [21], [22], [23], [24], [25], [26], [27], [28]. The following ones are considered in this work: Gaussian-process regression (GPR), kernel ridge regression (KRR), random forest regression (RFR), radial basis function (RBF), nearest neighbor (NN), and ANNs.…”
Section: Surrogate Models (mentioning)
confidence: 99%
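To make the list of surrogate families in the statement above concrete, here is a minimal sketch (assuming scikit-learn and synthetic stand-in data) that fits several of them to the same data set and compares validation error. The kernels, hyperparameters, and data are illustrative only, and an RBF-network surrogate is omitted because scikit-learn has no direct equivalent.

```python
# Fit several surrogate families on the same stand-in device response.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.kernel_ridge import KernelRidge
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(150, 3))        # geometric parameters (synthetic)
y = np.exp(-np.sum(X**2, axis=1))            # stand-in device response
X_val = rng.uniform(-1, 1, size=(50, 3))
y_val = np.exp(-np.sum(X_val**2, axis=1))

surrogates = {
    "GPR": GaussianProcessRegressor(),
    "KRR": KernelRidge(kernel="rbf", alpha=1e-3),
    "RFR": RandomForestRegressor(n_estimators=200, random_state=1),
    "NN (nearest neighbor)": KNeighborsRegressor(n_neighbors=3),
    "ANN": MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=1),
}
for name, model in surrogates.items():
    model.fit(X, y)
    rmse = np.sqrt(np.mean((model.predict(X_val) - y_val) ** 2))
    print(f"{name:>22s}: validation RMSE = {rmse:.4f}")
```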
“…In the past two decades, surrogate modeling, a kind of supervised machine learning, has emerged as a possible solution to the problem of how to accurately evaluate passive components in substantially shorter execution times than EM simulations [8], [9], [10], [11], [12], [13], [14], [15], [16], [17], [18], [19], [20], [21], [22], [23], [24], [25], [26], [27]. Moreover, a very extensive survey of recent advances in the area is available in [28].…”
Section: Introduction (mentioning)
confidence: 99%
“…Recent research efforts of ANN-based EM parametric modeling techniques have focused on automated model generation (AMG) methods [13], [51], [65], hybrid training methods incorporating parallel processing [34], and multiphysics parametric modeling [66], [67]. In [51], an advanced algorithm for AMG using neural networks has been presented, where interpolation techniques are incorporated to avoid redundant training in AMG, accelerating the overall model generation process.…”
Section: Thesis Organization (mentioning)
confidence: 99%
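A minimal sketch of that interpolation idea, under assumed names and thresholds: new points are served by interpolating already-available data when a crude reliability check passes, and the expensive simulation is invoked only otherwise. This is not the cited paper's exact criterion.

```python
# Serve new points from existing data where possible, to avoid redundant
# simulation/training passes (trust radius and interpolant are assumptions).
import numpy as np
from scipy.interpolate import RBFInterpolator

def em_simulator(x):
    return np.sin(4 * x[:, 0]) + 0.5 * x[:, 1]   # stand-in for EM simulation

rng = np.random.default_rng(2)
X_known = rng.uniform(0, 1, size=(40, 2))
y_known = em_simulator(X_known)
interp = RBFInterpolator(X_known, y_known)

def evaluate(x_new):
    """Return responses at x_new, simulating only where interpolation is unsafe."""
    y_hat = interp(x_new)
    # Distance to the nearest known sample as a crude reliability proxy.
    d = np.min(np.linalg.norm(X_known[None, :, :] - x_new[:, None, :], axis=2), axis=1)
    unsafe = d > 0.1                              # assumed trust radius
    if unsafe.any():
        y_hat[unsafe] = em_simulator(x_new[unsafe])
    return y_hat

print(evaluate(rng.uniform(0, 1, size=(5, 2))))
```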
“…As an extension of the work in [51], an enhanced AMG algorithm has been presented in [13] to automate the development process of knowledge-based neural network models for microwave applications. As a further advance, in [65], the parallel computation method has been incorporated into the AMG algorithm to achieve an additional speedup for neural modeling of microwave devices. In [34], a global neural network training method that combines hybrid training algorithm with parallel processing has been provided, where multiple neural network trainings are distributed to different processors and local search is performed in parallel to increase the probability and speed of finding a global optimum.…”
Section: Thesis Organization (mentioning)
confidence: 99%
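The parallel multi-start idea described for [34] can be sketched as follows, assuming joblib and scikit-learn with illustrative settings: several trainings with different initializations run on separate worker processes, and the lowest-loss model is kept.

```python
# Parallel multi-start training: dispatch restarts to worker processes,
# keep the best result (settings are illustrative, not those of the cited work).
import numpy as np
from joblib import Parallel, delayed
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(3 * X[:, 0]) * X[:, 1]

def train_once(seed):
    model = MLPRegressor(hidden_layer_sizes=(15,), max_iter=3000,
                         random_state=seed).fit(X, y)
    return model.loss_, model                   # final training loss and model

# One training job per seed, run on separate worker processes.
results = Parallel(n_jobs=4)(delayed(train_once)(s) for s in range(8))
best_loss, best_model = min(results, key=lambda r: r[0])
print(f"best of 8 restarts: training loss = {best_loss:.5f}")
```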