There has been growing interest in high-order tensor methods with adaptive regularization for nonconvex optimization, as they possess better, and in some cases optimal, worst-case evaluation complexity globally and faster convergence asymptotically. These algorithms crucially rely on repeatedly minimizing, at least locally, nonconvex multivariate Taylor-based polynomial sub-problems. Finding efficient techniques for solving these sub-problems beyond the second-order case has remained an open question. This paper proposes a second-order method, Quadratic Quartic Regularisation (QQR), for efficiently minimizing nonconvex quartically-regularized cubic polynomials, such as the AR$p$ sub-problem (Birgin et al. Math Program 163:359–368, 2017) with $p=3$. Inspired by Nesterov (Quartic regularity, 2022), QQR approximates the third-order tensor term by a linear combination of quadratic and quartic terms, yielding (possibly nonconvex) local models that can be solved to global optimality. To achieve accuracy $\epsilon$ in the first-order criticality of the sub-problem in finitely many iterations, we show that the error in the QQR method decreases either linearly or by at least $\mathcal{O}(\epsilon^{4/3})$ on locally convex iterations, and by at least $\mathcal{O}(\epsilon)$ on nonconvex ones, thereby improving, on these types of iterations, the general cubic-regularization bound. Preliminary numerical experiments indicate that two QQR variants perform competitively with state-of-the-art approaches such as ARC (also known as AR$p$ with $p=2$), achieving either a lower objective value or fewer iterations.
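To illustrate the kind of construction involved (the display below is a sketch under standard normalizations; $x_k$, $\sigma_k$ and the parameter $\alpha$ are generic placeholders rather than the paper's exact quantities), the AR$p$ sub-problem with $p=3$ takes the form
$$m_k(s) = f(x_k) + \nabla f(x_k)^{\top} s + \tfrac{1}{2}\,\nabla^2 f(x_k)[s]^2 + \tfrac{1}{6}\,\nabla^3 f(x_k)[s]^3 + \tfrac{\sigma_k}{4}\,\|s\|^4,$$
and a Young-type inequality, $\|s\|^3 \le \tfrac{1}{2\alpha}\,\|s\|^2 + \tfrac{\alpha}{2}\,\|s\|^4$ for any $\alpha>0$, shows that the cubic tensor term is bounded in magnitude by a combination of quadratic and quartic terms in $\|s\|$. This is the type of bound that makes a quadratic-quartic surrogate plausible; the precise QQR local models and their coefficient choices are those defined in the paper.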