Gradient-based optimization algorithms require function values and gradients of the objective function and constraints. When an optimization problem has many design variables, calculating gradients is the most time-consuming part of gradient-based optimization. Since the gradient is the rate of change of a function with respect to the design variables, the process of calculating gradients is often called sensitivity analysis. In this chapter, a brief overview of the most popular sensitivity analysis methods is presented: the global finite difference method, the discrete method, the continuum method, and the automatic differentiation method. The performance of each sensitivity analysis method is discussed in terms of accuracy, efficiency, and difficulty of implementation. The global finite difference method is the easiest to implement, but it is expensive, and its accuracy depends on the perturbation size. The discrete and continuum methods are accurate and efficient, but they require a certain level of implementation effort. The automatic differentiation method is accurate but requires a large initial implementation effort. Several numerical examples are provided using analytical and numerical methods of calculating sensitivity information.
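To make the perturbation-size dependence of the global finite difference method concrete, the following is a minimal sketch in Python; the test function, the evaluation point, and the step sizes are illustrative assumptions, not taken from the chapter. It compares a forward finite-difference gradient against a known analytic gradient for several perturbation sizes.

```python
import numpy as np

# Illustrative objective: f(x) = x1^2 + sin(x2). Its analytic gradient
# is known, so the finite-difference error can be measured directly.
def f(x):
    return x[0]**2 + np.sin(x[1])

def grad_exact(x):
    return np.array([2.0 * x[0], np.cos(x[1])])

def grad_forward_fd(f, x, h):
    """Forward finite-difference gradient: one extra function
    evaluation per design variable, so the cost grows linearly
    with the number of design variables."""
    g = np.zeros_like(x)
    fx = f(x)
    for i in range(len(x)):
        xp = x.copy()
        xp[i] += h
        g[i] = (f(xp) - fx) / h
    return g

x = np.array([1.0, 0.5])
for h in (1e-2, 1e-5, 1e-8, 1e-12):
    err = np.linalg.norm(grad_forward_fd(f, x, h) - grad_exact(x))
    print(f"h = {h:.0e}: gradient error = {err:.3e}")
```

Running this sketch, the error first shrinks as h decreases (truncation error falls) and then grows again for very small h (subtractive cancellation in floating-point arithmetic dominates), which is exactly why the accuracy of the global finite difference method depends on the perturbation size.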