2020
DOI: 10.1109/access.2020.2966661
A Modified Conjugate Gradient Approach for Reliability-Based Design Optimization

Abstract: To improve the efficiency of structural reliability-based design optimization (RBDO) based on the performance measure approach (PMA), a modified conjugate gradient approach (MCGA) is proposed for RBDO with nonlinear performance functions. In PMA, the advanced mean value (AMV) approach is widely used in engineering because of its simplicity and efficiency. However, the AMV method yields inefficient and unstable results for structural performance functions with high nonlinearity in RBDO. To overcome this shortcomi…
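The AMV iteration that the abstract refers to can be sketched as follows. This is a minimal illustration of the standard AMV update in PMA (finding the minimum-performance target point on the sphere of radius beta_t in standard normal space), not the paper's MCGA, whose details are truncated above; the function name and arguments are illustrative.

```python
import numpy as np

def amv_mpp_search(grad_g, beta_t, n_dim, tol=1e-6, max_iter=100):
    """Advanced mean value (AMV) iteration for the PMA inverse
    reliability problem: locate the minimum-performance target point
    on the sphere ||u|| = beta_t in standard normal u-space.

    grad_g : callable returning the gradient of the performance
             function G at a point u (illustrative interface).
    """
    u = np.zeros(n_dim)  # start at the mean point
    for _ in range(max_iter):
        g = grad_g(u)
        # AMV update: project onto the target sphere along the
        # steepest-descent direction of G
        u_new = -beta_t * g / np.linalg.norm(g)
        if np.linalg.norm(u_new - u) < tol:
            break
        u = u_new
    return u_new
```

For a linear performance function the iteration converges in one step; the instability the abstract mentions arises for highly nonlinear G, where successive gradient directions can oscillate, which is the situation a conjugate-gradient-type modification targets.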

Cited by 8 publications (5 citation statements)
References 32 publications (29 reference statements)
“…In a subsequent work, Frady et al proposed a method for gradient-based learning in HDC that utilized iterative projections and local linearizations to facilitate learning in high-dimensional spaces [34]. Building upon these developments, Wang et al presented a gradient-based HDC algorithm for clustering and classification tasks, which employed a convex optimization formulation to enhance HDC's performance in these applications [35]. Moreover, Su et al developed a gradient-based HDC algorithm for deep learning, illustrating the potential of gradient-based methods in improving the robustness and expressiveness of HDC models [36].…”
Section: B. Gradient-Based HDC Methods
confidence: 99%
“…For example, Frady and Somer's work involves iterative projections and local linearizations, which can be expensive. Similarly, Wang's convex optimization formulation for clustering and classification tasks can result in increased computational overhead [35]. Prior works have also explored the use of Holographic Reduced Representations (HRR), a family of models for gradient-based learning tasks. A notable attempt to capitalize on the symbolic properties of HRR was made by Nickel et al [38], who utilized binding operations to link elements within a knowledge graph.…”
Section: B. Gradient-Based HDC Methods
confidence: 99%
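The HRR binding operation mentioned in the statement above is commonly realized as circular convolution; a minimal sketch (the function names and test vectors are illustrative, not from the cited works):

```python
import numpy as np

def bind(a, b):
    """HRR binding via circular convolution, computed with the FFT."""
    return np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b), n=len(a))

def unbind(c, a):
    """Approximate unbinding: bind with the involution of a,
    which acts as an approximate inverse in HRR."""
    a_inv = np.concatenate(([a[0]], a[:0:-1]))
    return bind(c, a_inv)

rng = np.random.default_rng(0)
d = 1024
a = rng.normal(0.0, 1.0 / np.sqrt(d), d)  # unit-expected-norm vectors
b = rng.normal(0.0, 1.0 / np.sqrt(d), d)
c = bind(a, b)       # bound pair, e.g. a role-filler link in a graph
b_hat = unbind(c, a)  # noisy reconstruction of b
```

The reconstruction `b_hat` is correlated with `b` but nearly orthogonal to `a`, which is what lets binding link elements (e.g. entities in a knowledge graph) while keeping them individually retrievable.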
“…RBDO may utilize a number of different techniques: a gradient approach to increase the effectiveness of RBDO in extreme nonlinearities [38], sequential optimization to solve for quadratic models [39], [40], and single loop approaches to assist in long-term time-based optimization problems [41], [42]. The single loop approach focuses on approximating the reliability constraint of a system to estimate how long the model will remain efficient or run without failure [43].…”
Section: List of Figures
confidence: 99%
“…With the development of mechanical systems towards high precision and reliability, it is essential to consider uncertainties in engineering design. Reliability-based design optimization (RBDO) is one of the most powerful design methods considering uncertainties, which seeks an optimum design that satisfies probability constraints [1]. Uncertainties can be generally classified into aleatory and epistemic uncertainties [2].…”
Section: Introduction
confidence: 99%