Title (deu)
Coordinate descent methods beyond separability and smoothness
Speaker / Lecturer
Ion Necoara
U Politehnica Bucharest
Description (deu)
In this talk we consider optimization problems that are possibly nonseparable and nonsmooth. When the objective function is composite, we design random coordinate proximal gradient methods that take into account the nonseparable form of the objective. When the objective function is nonsmooth, we introduce a general smooth approximation framework for the original function, which covers the most important classes of smoothing techniques from the literature, and then apply random (accelerated) coordinate descent methods to minimize the corresponding smooth approximations. Our algorithms achieve scalability by constructing at each iteration a local approximation model of the whole nonseparable objective function along a random subspace of user-determined dimension. We present a complete worst-case complexity analysis for our random coordinate (proximal) gradient methods in both convex and nonconvex settings. Numerical results on applications ranging from the smallest eigenvalue problem to matrix factorization and support vector machine classification confirm the efficiency of our algorithms.
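To illustrate the basic idea behind coordinate descent (this is a generic textbook sketch, not the speaker's nonseparable/nonsmooth algorithms), a random coordinate gradient method updates one randomly chosen coordinate per iteration with a gradient step:

```python
import numpy as np

def random_coordinate_descent(grad, x0, step, n_iters=5000, seed=0):
    """Minimize a smooth function by updating one randomly chosen
    coordinate per iteration with a gradient step.
    `grad` returns the full gradient; only one entry is used per step."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    n = x.size
    for _ in range(n_iters):
        i = rng.integers(n)           # pick a coordinate uniformly at random
        x[i] -= step * grad(x)[i]     # gradient step along that coordinate only
    return x

# Toy example: minimize f(x) = 0.5 * x^T A x - b^T x (strongly convex quadratic)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b
x_star = random_coordinate_descent(grad, np.zeros(2), step=0.2)
```

The methods discussed in the talk generalize this template: instead of a single coordinate, a random subspace of user-chosen dimension is sampled, and a proximal step or a smooth approximation handles composite or nonsmooth terms.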
Keywords (deu)
One World Optimization Seminar
Subject (eng)
ÖFOS 2012 -- 101 -- Mathematics
Type (eng)
Language
English [eng]
Persistent identifier
Project
Title (deu)
One World Optimization Seminar in Vienna
Project description (eng)
Workshop
Start date
2024-06-03
End date
2024-06-07
Project homepage
Date created
2024-06-07
Place of creation (eng)
ESI
Duration
30 minutes 04 seconds
License
Persistent identifier
https://phaidra.univie.ac.at/o:2070057
Media Package Identifier
id=3da89f48-ef4d-4423-905e-03413405d312