“…If $B_k = \frac{1}{\bar{\alpha}_k^{BB}} I$ is chosen, where $\bar{\alpha}_k^{BB}$ is some modified BB stepsize, then $\alpha_k^{AOS}$ reduces to $\bar{\alpha}_k^{BB}$, and the corresponding method is a modified BB method [4,7,30]. If $B_k = \frac{1}{t} I$ with $t > 0$, then $\alpha_k^{AOS}$ is the fixed stepsize $t$, and the corresponding method is the gradient method with fixed stepsize [16,22,33]. In this sense, the approximate optimal gradient method is a generalisation of the BB methods.…”
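For context, a minimal worked reduction may help; it assumes the quadratic approximation model commonly used to define the approximate optimal stepsize (the model itself, the gradient $g_k = \nabla f(x_k)$, and the closed form below are not stated in the excerpt and are introduced here only as a sketch):
\[
\alpha_k^{AOS} \;=\; \arg\min_{\alpha>0}\Big\{ f(x_k) - \alpha\, g_k^{T} g_k + \tfrac{\alpha^2}{2}\, g_k^{T} B_k g_k \Big\}
\;=\; \frac{g_k^{T} g_k}{g_k^{T} B_k g_k},
\]
so that
\[
B_k = \tfrac{1}{\bar{\alpha}_k^{BB}} I \;\Longrightarrow\; \alpha_k^{AOS} = \frac{g_k^{T} g_k}{\tfrac{1}{\bar{\alpha}_k^{BB}}\, g_k^{T} g_k} = \bar{\alpha}_k^{BB},
\qquad
B_k = \tfrac{1}{t} I \;\Longrightarrow\; \alpha_k^{AOS} = t.
\]
Under this model, the two special choices of $B_k$ recover the modified BB stepsize and the fixed stepsize, respectively, which is the sense in which the approximate optimal gradient method generalises the BB methods.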