The numerical methods employed in the solution of many scientific computing problems require the computation of derivatives of a function f : R^n → R^m. Both the accuracy and the computational requirements of the derivative computation are usually of critical importance for the robustness and speed of the numerical solution. ADIFOR (Automatic Differentiation of FORtran) is a source transformation tool that accepts Fortran 77 code for the computation of a function and writes portable Fortran 77 code for the computation of its derivatives. In contrast to previous approaches, ADIFOR views automatic differentiation as a source transformation problem. ADIFOR employs the data analysis capabilities of the ParaScope Parallel Programming Environment, which enable us to handle arbitrary Fortran 77 codes and to exploit the computational context in the computation of derivatives. Experimental results show that ADIFOR can handle real-life codes and that ADIFOR-generated codes are competitive with divided-difference approximations of derivatives. In addition, studies suggest that the source transformation approach to automatic differentiation may improve the time to compute derivatives by orders of magnitude.
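The core idea of forward-mode automatic differentiation can be illustrated without the source transformation machinery of ADIFOR itself. The following is a minimal sketch in Python using operator-overloaded dual numbers (a different mechanism from ADIFOR's code rewriting, but computing the same chain-rule propagation); all names here are illustrative, not part of ADIFOR.

```python
# Minimal forward-mode AD via dual numbers: each value carries (val, dot),
# where dot is the derivative propagated exactly by the chain rule.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val = val   # function value
        self.dot = dot   # derivative value

    def _lift(self, other):
        return other if isinstance(other, Dual) else Dual(float(other))

    def __add__(self, other):
        other = self._lift(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = self._lift(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1   # analytically, f'(x) = 6x + 2

x = Dual(2.0, 1.0)   # seed dx/dx = 1
y = f(x)
print(y.val, y.dot)  # 17.0 14.0
```

Unlike a divided-difference approximation, the propagated `dot` field is exact to machine precision, which is the accuracy advantage the abstract refers to.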
The computation of large sparse Jacobian matrices is required in many important large-scale scientific problems. We consider three approaches to computing such matrices: hand-coding, difference approximations, and automatic differentiation using the ADIFOR (Automatic Differentiation in Fortran) tool. We compare the numerical reliability and computational efficiency of these approaches on applications from the MINPACK-2 test problem collection. Our conclusion is that automatic differentiation is the method of choice, leading to results that are as accurate as hand-coded derivatives, while at the same time outperforming difference approximations in both accuracy and speed.
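The accuracy gap between difference approximations and hand-coded (or AD-generated) derivatives is easy to demonstrate. The sketch below, with an arbitrarily chosen test function, compares a forward-difference quotient against the analytic derivative; the residual error of order h is exactly what automatic differentiation avoids.

```python
import math

def f(x):
    return math.exp(x) * math.sin(x)

def f_prime(x):
    # hand-coded analytic derivative, standing in for an AD-generated one
    return math.exp(x) * (math.sin(x) + math.cos(x))

def forward_difference(f, x, h=1e-8):
    # one extra function evaluation per derivative, accurate only to O(h)
    # plus rounding error of order eps/h
    return (f(x + h) - f(x)) / h

x = 1.0
err = abs(forward_difference(f, x) - f_prime(x))
print(err)  # small but nonzero; AD would reproduce f_prime(x) exactly
```

Choosing the step h is itself a trade-off (truncation error shrinks with h while rounding error grows), a tuning problem that simply disappears with automatic differentiation.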
Dedicated to Professor C.G. Broyden on the occasion of his 60th birthday. When nonlinear equation solvers are applied to parameter-dependent problems, their iterates can be interpreted as functions of these variable parameters. The derivatives (if they exist) of these iterated functions can be recursively evaluated by the forward mode of automatic differentiation. Then one may ask whether and how fast these derivative values converge to the derivative of the implicit solution function, which may be needed for parameter identification, sensitivity studies, or design optimization. It is shown here that derivative convergence is achieved with an R-linear or possibly R-superlinear rate for a large class of memoryless contractions or secant updating methods. For a wide class of multistep contractions, we obtain R-linear convergence of a simplified derivative recurrence, which is more economical and can be easily generalized to higher-order derivatives. We also formulate a constructive criterion for derivative convergence based on the implicit function theorem. All theoretical results are confirmed by numerical experiments on small test examples.
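The derivative recurrence described above can be sketched for a simple fixed-point iteration x_{k+1} = g(x_k, p). Differentiating each iterate with respect to the parameter p gives dx_{k+1} = g_x·dx_k + g_p, and for a contraction this sequence converges to the implicit-function-theorem value g_p/(1 − g_x) at the fixed point. The contraction g below is an arbitrary illustrative choice, not taken from the paper.

```python
import math

# Fixed-point iteration x_{k+1} = g(x_k, p) with g(x, p) = cos(p*x)/2,
# a contraction near p = 1 (|g_x| < 1), so both the iterates and their
# parameter derivatives converge.
def g(x, p):
    return math.cos(p * x) / 2.0

def g_x(x, p):   # partial derivative of g with respect to x
    return -p * math.sin(p * x) / 2.0

def g_p(x, p):   # partial derivative of g with respect to p
    return -x * math.sin(p * x) / 2.0

p = 1.0
x, dx = 0.0, 0.0
for _ in range(100):
    # iterate and its derivative recurrence, advanced in lockstep
    x, dx = g(x, p), g_x(x, p) * dx + g_p(x, p)

# implicit function theorem value at the converged fixed point
dx_ift = g_p(x, p) / (1.0 - g_x(x, p))
print(x, dx, dx_ift)  # dx and dx_ift agree to machine precision
```

Here the derivative recurrence is evaluated alongside the solver itself, which is exactly what the forward mode of automatic differentiation produces when applied to the iteration's source code.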