2008
DOI: 10.1016/j.jmatprotec.2007.09.085

Prediction of martensite and austenite start temperatures of the Fe-based shape memory alloys by artificial neural networks

Cited by 34 publications (13 citation statements) · References 32 publications (27 reference statements)
“…To predict the phase transformation behavior and to design the shape-memory property, the optimum concentrations of the alloy components can be calculated with various equations for the Gibbs free energy difference between the γ- and ε-phases, 28–31) and various empirical equations for the Ms temperature of the γ → ε transformation. 32,33) On the other hand, the roles of silicon are diversified and complicated. Silicon hardens the parent matrix to suppress dislocation gliding, while it promotes the γ → ε martensitic transformation by lowering the stacking fault energy.…”
Section: Functions of Elements
confidence: 99%
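The empirical Ms equations this excerpt refers to (refs. 32,33) are linear functions of alloy composition. As a minimal sketch of how such a relation is evaluated, assuming a hypothetical base temperature and per-element coefficients (placeholders only, not the published values):

```python
# Sketch of an empirical, composition-based Ms estimate for an Fe-based
# shape memory alloy. The linear form mirrors the empirical equations
# cited above, but ALL coefficients here are hypothetical placeholders,
# not the values from refs. 32,33.

def ms_temperature(composition_wt_pct: dict[str, float]) -> float:
    """Estimate Ms (K) of the gamma -> epsilon transformation from
    composition in wt.%. Coefficients are illustrative only."""
    base = 600.0  # hypothetical Ms of the reference composition, in K
    coeffs = {"Mn": -10.0, "Si": -5.0, "Cr": -8.0, "Ni": -12.0}  # hypothetical K per wt.%
    return base + sum(coeffs.get(el, 0.0) * c
                      for el, c in composition_wt_pct.items())

print(ms_temperature({"Mn": 30.0, "Si": 6.0}))  # e.g. an Fe-30Mn-6Si alloy
```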
“…Each ANN has three types of layers: an input layer, a hidden layer, and an output layer. Most ANNs have a single hidden layer, though there may be more than one [12,13,14].…”
Section: Artificial Neural Network (ANN), 3.1 Overview of ANN
confidence: 99%
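As a minimal sketch of the three-layer topology this excerpt describes (input layer, one hidden layer, output layer), assuming NumPy and layer sizes chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# One hidden layer, the most common ANN topology per the excerpt above.
n_in, n_hidden, n_out = 4, 8, 1  # e.g. 4 composition inputs -> 1 temperature

W1 = rng.normal(scale=0.1, size=(n_hidden, n_in))   # input -> hidden weights
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_out, n_hidden))  # hidden -> output weights
b2 = np.zeros(n_out)

def forward(x):
    """Propagate one input pattern through the three layers."""
    h = np.tanh(W1 @ x + b1)  # hidden-layer activations
    return W2 @ h + b2        # linear output unit

print(forward(np.array([0.30, 0.06, 0.0, 0.0])))
```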
“…It uses a hyperbolic tangent sigmoid transfer function. The summation u_i is transformed by a scalar-to-scalar function called an ''activation or transfer function", f(u_i), to yield a value called the unit's ''activation", given as f(u_i) = tanh(u_i) = (e^(u_i) − e^(−u_i)) / (e^(u_i) + e^(−u_i)). The most commonly used training algorithm for the multi-layer perceptron is the back-propagation algorithm (BPA) [73]. It is a gradient-descent method that minimizes the error for a particular training pattern.…”
Section: Artificial Neural Network (ANN)
confidence: 99%
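A minimal sketch of the pieces this excerpt names: the summation u_i, the hyperbolic tangent sigmoid transfer function f(u_i), and one gradient-descent (back-propagation) update for a single training pattern. The layer sizes, learning rate, and sample data are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 4, 8, 1
W1 = rng.normal(scale=0.1, size=(n_hidden, n_in)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_out, n_hidden)); b2 = np.zeros(n_out)

def backprop_step(x, t, W1, b1, W2, b2, lr=0.05):
    """One gradient-descent update minimizing the squared error for a
    single training pattern (x, t); the arrays are updated in place."""
    # Forward pass: u_i = sum_j w_ij * x_j + b_i, then f(u_i) = tanh(u_i).
    h = np.tanh(W1 @ x + b1)   # hidden activations f(u_i)
    y = W2 @ h + b2            # linear output unit
    e = y - t                  # error for this training pattern
    # Backward pass: gradients of E = 0.5 * e^2 via the chain rule.
    delta1 = (W2.T @ e) * (1.0 - h ** 2)  # uses tanh'(u) = 1 - tanh(u)^2
    W2 -= lr * np.outer(e, h);      b2 -= lr * e       # in-place updates
    W1 -= lr * np.outer(delta1, x); b1 -= lr * delta1
    return 0.5 * float(e @ e)

x = np.array([0.30, 0.06, 0.0, 0.0])  # e.g. scaled composition inputs
t = np.array([0.45])                  # e.g. scaled Ms target
for _ in range(200):
    err = backprop_step(x, t, W1, b1, W2, b2)
print(err)  # squared error shrinks toward zero over the updates
```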