Lecture title: A Family of Spectral Gradient Methods for Optimization
Speaker: Prof. Yu-Hong Dai, Academy of Mathematics and Systems Science, Chinese Academy of Sciences
We propose a family of spectral gradient methods whose stepsize is determined by a convex combination of the short Barzilai-Borwein (BB) stepsize and the long BB stepsize. We show that each member of the family possesses a certain quasi-Newton property in the least-squares sense. The family also includes some other gradient methods as special cases. We prove that the family of methods is R-superlinearly convergent for two-dimensional strictly convex quadratics, and R-linearly convergent in the n-dimensional case. Numerical results of the family under different settings are presented, which demonstrate that the proposed family is promising. This is joint work with Yakui Huang and Xinwei.
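To illustrate the kind of method the abstract describes, the following is a minimal sketch, not the authors' actual algorithm: a gradient method for a strictly convex quadratic f(x) = (1/2)xᵀAx − bᵀx in which the stepsize is a plain convex combination γ·(short BB) + (1−γ)·(long BB), where the long BB stepsize is sᵀs/sᵀy and the short BB stepsize is sᵀy/yᵀy. The weight `gamma`, the initial stepsize, and the stopping rule are all illustrative assumptions.

```python
# Hypothetical sketch of a spectral gradient method whose stepsize is a
# convex combination of the short and long Barzilai-Borwein stepsizes.
# Applied here to a 2-D strictly convex quadratic f(x) = 1/2 x^T A x - b^T x.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def grad(A, b, x):
    # Gradient of the quadratic: A x - b
    return [dot(row, x) - bi for row, bi in zip(A, b)]

def spectral_gradient(A, b, x0, gamma=0.5, tol=1e-10, max_iter=500):
    x = list(x0)
    g = grad(A, b, x)
    alpha = 1.0  # initial stepsize before BB information is available (assumption)
    for _ in range(max_iter):
        if dot(g, g) < tol ** 2:
            break
        x_new = [xi - alpha * gi for xi, gi in zip(x, g)]
        g_new = grad(A, b, x_new)
        s = [a - c for a, c in zip(x_new, x)]  # step difference
        y = [a - c for a, c in zip(g_new, g)]  # gradient difference
        sy = dot(s, y)
        if sy > 0:  # holds away from the minimizer since A is positive definite
            long_bb = dot(s, s) / sy       # long BB stepsize (BB1)
            short_bb = sy / dot(y, y)      # short BB stepsize (BB2)
            alpha = gamma * short_bb + (1 - gamma) * long_bb
        x, g = x_new, g_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = spectral_gradient(A, b, [0.0, 0.0])
# x approximates the solution of A x = b, i.e. (1/11, 7/11)
```

Setting `gamma = 0` or `gamma = 1` recovers the pure long-BB and short-BB gradient methods, consistent with the abstract's remark that the family contains other gradient methods as special cases.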