A family of second-order methods for convex ℓ1-regularized optimization

Date

2016

Journal Title

Journal ISSN

Volume Title

Publisher

Springer Heidelberg

Access Rights

info:eu-repo/semantics/closedAccess

Abstract

This paper is concerned with the minimization of an objective that is the sum of a convex function f and an ℓ1 regularization term. Our interest is in active-set methods that incorporate second-order information about the function f to accelerate convergence. We describe a semismooth Newton framework that can be used to generate a variety of second-order methods, including block active-set methods, orthant-based methods, and a second-order iterative soft-thresholding method. The paper proposes a new active-set method that performs multiple changes in the active manifold estimate at every iteration and employs a mechanism for correcting these estimates when needed. This corrective mechanism is also evaluated in an orthant-based method. Numerical tests comparing the performance of three active-set methods are presented.
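For context, the sketch below shows the standard first-order iterative soft-thresholding (ISTA) baseline for min_x f(x) + λ‖x‖₁ with f(x) = ½‖Ax − b‖², the kind of problem the abstract refers to. It is not the paper's second-order method; the function and variable names are illustrative, and the paper's contribution is to accelerate this type of iteration with active-set and semismooth Newton (second-order) information.

```python
import numpy as np

def soft_threshold(z, tau):
    """Elementwise soft-thresholding: the proximal operator of tau * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def ista(A, b, lam, step=None, iters=500):
    """First-order iterative soft-thresholding for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    Illustrative baseline only; the paper's methods add second-order
    (active-set / semismooth Newton) information on top of this idea."""
    if step is None:
        # Use 1/L, where L = ||A||_2^2 is the Lipschitz constant of grad f.
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                      # gradient of the smooth part f
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Small synthetic usage example
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = ista(A, b, lam=0.1)
print("nonzeros recovered:", np.count_nonzero(np.abs(x_hat) > 1e-6))
```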

Description

Keywords

Thresholding Algorithm, Newton, Shrinkage, Strategy, Online

Source

Mathematical Programming

WoS Q Value

Q1

Scopus Q Value

Q1

Volume

159

Issue

1-2

Citation