1998
DOI: 10.1017/s0263574798001064

Manipulator Inverse Kinematics using an Adaptive Back-propagation Algorithm and Radial Basis Function with a Lookup Table

Abstract: This is an extension of previous work which used an artificial neural network with a back-propagation algorithm and a lookup table to find the inverse kinematics for a manipulator arm moving along pre-defined trajectories. The work now described shows that the performance of this technique can be improved if the back-propagation is made to be adaptive. Also, further improvement is obtained by using the whole workspace to train the neural network rather than just a pre-defined path. For the inverse kine…
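As a hedged illustration of the approach the abstract describes (not the authors' implementation), learning inverse kinematics with adaptive back-propagation can be sketched for a hypothetical two-link planar arm: sample joint angles across the whole workspace, compute end-effector positions by forward kinematics, and train a small network to map positions back to angles, growing the learning rate while the error falls and cutting it when the error rises (a "bold driver" style adaptation). The link lengths, network size, and adaptation factors below are all illustrative assumptions.

```python
import numpy as np

# Hypothetical two-link planar arm (link lengths are illustrative assumptions).
L1, L2 = 1.0, 0.8

def forward_kinematics(theta):
    """Map joint angles (N, 2) to end-effector positions (N, 2)."""
    x = L1 * np.cos(theta[:, 0]) + L2 * np.cos(theta[:, 0] + theta[:, 1])
    y = L1 * np.sin(theta[:, 0]) + L2 * np.sin(theta[:, 0] + theta[:, 1])
    return np.stack([x, y], axis=1)

rng = np.random.default_rng(0)
# Sample the whole workspace, not just one pre-defined path (as the abstract suggests).
theta = rng.uniform([0.0, 0.1], [np.pi / 2, np.pi / 2], size=(500, 2))
X, Y = forward_kinematics(theta), theta  # learn the inverse map: position -> angles

# One-hidden-layer network trained by back-propagation.
W1 = rng.normal(0.0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 2)); b2 = np.zeros(2)

lr, prev_err = 0.05, np.inf
for epoch in range(2000):
    H = np.tanh(X @ W1 + b1)            # forward pass
    P = H @ W2 + b2
    E = P - Y
    err = float(np.mean(E ** 2))
    # Adaptive back-propagation: grow the step on improvement, shrink on failure
    # (capped for stability; the factors are illustrative).
    lr = min(lr * 1.05, 0.5) if err < prev_err else max(lr * 0.5, 1e-4)
    prev_err = err
    dP = 2 * E / len(X)
    dH = (dP @ W2.T) * (1 - H ** 2)     # back-propagate through tanh
    W2 -= lr * (H.T @ dP); b2 -= lr * dP.sum(axis=0)
    W1 -= lr * (X.T @ dH); b1 -= lr * dH.sum(axis=0)

print(round(err, 4))  # mean-squared joint-angle error after training
```

The workspace is restricted so the inverse map is single-valued (one elbow configuration); without that restriction a direct regression would average over multiple valid joint solutions.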

Cited by 3 publications (2 citation statements). References 3 publications.
“…There have been many solutions proposed using neural networks to solve the inverse kinematics problem of an unknown geometry manipulator, such as multi-layer perceptron (MLPN) (Watanabe and Shimizu, 1991;Guez and Ahmad, 1998;Choi and Lawrence, 1992;Binggul et al, 2005;Morris and Mansor, 1997;Guez and Ahmad, 1989;Takanashi, 1990;Alsina et al, 1995;Lui and Ito, 1995), self-organised network systems (Zeller and Schulten, 1996;Barhen et al, 1989;Herman et al, 2003) and radial basis function networks (RBFNs) (Driscoll, 2000;Zhang et al, 2004;Yang et al, 2000;Morris and Mansor, 1998;Mayorga and Sanongboon, 2002). The MLPN is the most popular neural network applied to functional approximation problems.…”
Section: Introduction
confidence: 99%
“…On the other hand, RBFNs which are conceptually simpler and possess the ability to model any non-linear function conveniently have become an alternative to MLPNs (Sun and Zhu, 2012). There have been several approaches using the RBFN to compare with the performance of the MLPN in the inverse kinematics problem (Driscoll, 2000;Zhang et al, 2004;Yang et al, 2000;Morris and Mansor, 1998;Mayorga and Sanongboon, 2002). However, all the previously mentioned approaches tried to produce an inverse solution of the forward kinematics transformation to build the mapping from world coordinate space to joint angle space.…”
Section: Introduction
confidence: 99%
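A minimal sketch of the kind of radial basis function network these statements compare against the MLPN (an illustration, not the cited authors' code): Gaussian basis functions centred on a subset of the training inputs, with the output weights solved directly by linear least squares. The toy target function, number of centres, and basis width are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy smooth nonlinear mapping to approximate (a stand-in for an
# inverse-kinematics map from world coordinates to a joint angle).
X = rng.uniform(-1.0, 1.0, size=(400, 2))
y = np.sin(np.pi * X[:, 0]) * np.cos(np.pi * X[:, 1])

# Gaussian RBF layer: centres drawn from the training data, shared width.
centres = X[rng.choice(len(X), size=40, replace=False)]
width = 0.5

def rbf_features(inputs):
    """Squared distances to each centre, passed through a Gaussian."""
    d2 = ((inputs[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * width ** 2))

# Output weights by linear least squares -- the one-shot step that makes
# RBFN training conceptually simpler than iterative back-propagation.
Phi = rbf_features(X)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

mse = float(np.mean((Phi @ w - y) ** 2))
print(round(mse, 4))  # training mean-squared error of the RBFN fit
```

Because only the linear output layer is fitted, training reduces to a single least-squares solve, which is the conceptual simplicity the citation statement attributes to RBFNs.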