Search Results - "Richtárik, Peter"
1
Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods
Published in Computational optimization and applications (01-12-2020) “…In this paper we study several classes of stochastic optimization algorithms enriched with heavy ball momentum. Among the methods studied are: stochastic…”
Journal Article
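The heavy ball idea behind this entry can be illustrated on a simple least-squares objective. A minimal sketch, not the paper's analyzed methods; the function name, data, and step sizes here are all illustrative:

```python
import numpy as np

def sgd_heavy_ball(A, b, lr=0.05, beta=0.5, iters=2000, seed=0):
    """Minimize 0.5*||Ax - b||^2 by sampling one row per step,
    adding a heavy-ball momentum term beta*(x_k - x_{k-1})."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x, x_prev = np.zeros(d), np.zeros(d)
    for _ in range(iters):
        i = rng.integers(n)               # sample one data point
        g = (A[i] @ x - b[i]) * A[i]      # stochastic gradient of 0.5*(a_i^T x - b_i)^2
        x, x_prev = x - lr * g + beta * (x - x_prev), x
    return x

# Tiny consistent system: the iterates should approach the solution x_star.
A = np.array([[2.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
x_star = np.array([1.0, -1.0])
x = sgd_heavy_ball(A, A @ x_star)
```

Because the system is consistent, the stochastic gradients vanish at the solution, so even a constant step size converges here.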
2
Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
Published in Mathematical programming (01-04-2014) “…In this paper we develop a randomized block-coordinate descent method for minimizing the sum of a smooth and a simple nonsmooth block-separable convex function…”
Journal Article
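A minimal single-coordinate sketch of the randomized coordinate descent idea, on a smooth quadratic only; the paper's setting also handles a block-separable nonsmooth term via a prox step, which is only noted in a comment here:

```python
import numpy as np

def rcd(A, b, iters=3000, seed=0):
    """Randomized coordinate descent for 0.5*||Ax - b||^2.
    Each step updates one uniformly sampled coordinate with step 1/L_i,
    where L_i = ||A[:, i]||^2 is the coordinate-wise Lipschitz constant."""
    rng = np.random.default_rng(seed)
    d = A.shape[1]
    L = np.sum(A * A, axis=0)        # L_i = ||A[:, i]||^2
    x = np.zeros(d)
    for _ in range(iters):
        i = rng.integers(d)
        g_i = A[:, i] @ (A @ x - b)  # i-th partial derivative
        x[i] -= g_i / L[i]
        # (a nonsmooth separable term would add a coordinate-wise prox here)
    return x

A = np.array([[2.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
x_star = np.array([1.0, -1.0])
x = rcd(A, A @ x_star)
```

For a quadratic, the 1/L_i step exactly minimizes along the chosen coordinate, which is what gives the fast per-iteration progress.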
3
Revisiting Randomized Gossip Algorithms: General Framework, Convergence Rates and Novel Block and Accelerated Protocols
Published in IEEE transactions on information theory (01-12-2021) “…In this work we present a new framework for the analysis and design of randomized gossip algorithms for solving the average consensus problem. We show how…”
Journal Article
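The average consensus problem from this entry can be sketched with classical pairwise gossip; an illustrative baseline only, since the paper's framework covers block and accelerated protocols beyond this:

```python
import numpy as np

def randomized_gossip(values, edges, iters=2000, seed=0):
    """Randomized pairwise gossip for average consensus: at each step a
    random edge (i, j) is activated and both endpoints replace their
    values with the pair's average. The sum is preserved, so all nodes
    converge to the global mean."""
    rng = np.random.default_rng(seed)
    x = np.array(values, dtype=float)
    for _ in range(iters):
        i, j = edges[rng.integers(len(edges))]
        x[i] = x[j] = (x[i] + x[j]) / 2
    return x

# Path graph on 4 nodes; all values should approach the mean 2.5.
x = randomized_gossip([1.0, 2.0, 3.0, 4.0], [(0, 1), (1, 2), (2, 3)])
```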
4
Coordinate descent with arbitrary sampling I: algorithms and complexity
Published in Optimization methods & software (02-09-2016) “…We study the problem of minimizing the sum of a smooth convex function and a convex block-separable regularizer and propose a new randomized coordinate descent…”
Journal Article
5
Parallel coordinate descent methods for big data optimization
Published in Mathematical programming (01-03-2016) “…In this work we show that randomized (block) coordinate descent methods can be accelerated by parallelization when applied to the problem of minimizing the sum…”
Journal Article
6
A Randomized Exchange Algorithm for Computing Optimal Approximate Designs of Experiments
Published in Journal of the American Statistical Association (02-01-2020) “…We propose a class of subspace ascent methods for computing optimal approximate designs that covers existing algorithms as well as new and more efficient ones…”
Journal Article
7
Variance-Reduced Methods for Machine Learning
Published in Proceedings of the IEEE (01-11-2020) “…Stochastic optimization lies at the heart of machine learning, and its cornerstone is stochastic gradient descent (SGD), a method introduced over 60 years…”
Journal Article
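As an illustration of variance reduction, a minimal SVRG-style loop on least squares; SVRG is one well-known member of the family this survey covers, and the step size and epoch counts below are arbitrary:

```python
import numpy as np

def svrg(A, b, lr=0.1, epochs=20, seed=0):
    """SVRG-style variance reduction for f(x) = (1/2n) * sum_i (a_i^T x - b_i)^2:
    each inner step corrects the sampled gradient with a full gradient
    computed at a periodic snapshot point, so the gradient estimator's
    variance vanishes as the iterates approach the solution."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(epochs):
        snapshot = x.copy()
        full_grad = A.T @ (A @ snapshot - b) / n        # exact gradient at snapshot
        for _ in range(n):
            i = rng.integers(n)
            g_i = (A[i] @ x - b[i]) * A[i]              # grad of f_i at x
            g_snap = (A[i] @ snapshot - b[i]) * A[i]    # grad of f_i at snapshot
            x = x - lr * (g_i - g_snap + full_grad)     # unbiased, variance-reduced
    return x

A = np.array([[2.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
x_star = np.array([1.0, -1.0])
x = svrg(A, A @ x_star, epochs=100)
```

The correction term has zero mean, so each step is an unbiased gradient estimate, yet its variance shrinks as x and the snapshot both approach the optimum.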
8
Coordinate descent with arbitrary sampling II: expected separable overapproximation
Published in Optimization methods & software (02-09-2016) “…The design and complexity analysis of randomized coordinate descent methods, and in particular of variants which update a random subset (sampling) of…”
Journal Article
9
Dualize, Split, Randomize: Toward Fast Nonsmooth Optimization Algorithms
Published in Journal of optimization theory and applications (01-10-2022) “…We consider minimizing the sum of three convex functions, where the first one F is smooth, the second one is nonsmooth and proximable and the third one is the…”
Journal Article
10
Randomized Distributed Mean Estimation: Accuracy vs. Communication
Published in Frontiers in applied mathematics and statistics (18-12-2018) “…We consider the problem of estimating the arithmetic average of a finite collection of real vectors stored in a distributed fashion across several compute…”
Journal Article
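The accuracy-vs-communication trade-off in this entry can be illustrated with unbiased stochastic rounding, one simple quantization scheme; this is a generic sketch, not the paper's specific protocols:

```python
import numpy as np

def stochastic_round(v, rng):
    """Unbiased stochastic rounding: each entry is rounded down or up
    with probabilities chosen so that E[round(v)] = v exactly."""
    low = np.floor(v)
    return low + (rng.random(v.shape) < (v - low))

# Each "node" sends only a coarsely quantized vector; the server
# averages the quantized messages. The estimate is noisy but unbiased.
rng = np.random.default_rng(0)
vectors = [rng.normal(size=4) for _ in range(5)]
true_mean = np.mean(vectors, axis=0)
est = np.mean([stochastic_round(v, rng) for v in vectors], axis=0)
```

Sending integers instead of floats cuts communication; averaging over more nodes (or repeated rounds) shrinks the quantization noise without introducing bias.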
11
Fastest rates for stochastic mirror descent methods
Published in Computational optimization and applications (01-07-2021) “…Relative smoothness—a notion introduced in Birnbaum et al. (Proceedings of the 12th ACM conference on electronic commerce, ACM, pp 127–136, 2011) and recently…”
Journal Article
12
Stochastic quasi-gradient methods: variance reduction via Jacobian sketching
Published in Mathematical programming (2021) “…We develop a new family of variance reduced stochastic gradient descent methods for minimizing the average of a very large number of smooth functions. Our…”
Journal Article
13
On optimal probabilities in stochastic coordinate descent methods
Published in Optimization letters (01-08-2016) “…We propose and analyze a new parallel coordinate descent method—NSync—in which at each iteration a random subset of coordinates is updated, in parallel,…”
Journal Article
14
Inexact Coordinate Descent: Complexity and Preconditioning
Published in Journal of optimization theory and applications (01-07-2016) “…One of the key steps at each iteration of a randomized block coordinate descent method consists in determining the update to a block of variables. Existing…”
Journal Article
15
Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent
Published in SIAM review (01-01-2016) “…We propose a new randomized coordinate descent method for minimizing the sum of convex functions, each of which depends on a small number of coordinates only…”
Journal Article
16
Stochastic distributed learning with gradient quantization and double-variance reduction
Published in Optimization methods & software (02-01-2023) “…We consider distributed optimization over several devices, each sending incremental model updates to a central server. This setting is considered, for…”
Journal Article
17
Best Pair Formulation & Accelerated Scheme for Non-Convex Principal Component Pursuit
Published in IEEE transactions on signal processing (2020) “…Given two disjoint sets, the best pair problem aims to find a point in one set and another point in the other set with minimal distance between them. In this…”
Journal Article
18
Unified Analysis of Stochastic Gradient Methods for Composite Convex and Smooth Optimization
Published in Journal of optimization theory and applications (01-11-2023) “…We present a unified theorem for the convergence analysis of stochastic gradient algorithms for minimizing a smooth and convex loss plus a convex regularizer…”
Journal Article
19
Semi-Stochastic Gradient Descent Methods
Published in Frontiers in applied mathematics and statistics (23-05-2017) “…In this paper we study the problem of minimizing the average of a large number of smooth convex loss functions. We propose a new method, S2GD (Semi-Stochastic…”
Journal Article
20
Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
Published in Computational optimization and applications (01-06-2021) “…We consider the problem of minimizing the sum of two convex functions: one is differentiable and relatively smooth with respect to a reference convex function,…”
Journal Article
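The relatively smooth setting in the last entry is commonly handled by a Bregman proximal gradient step. As a sketch in standard form (not necessarily the accelerated variant the paper develops), with f the differentiable term, \psi the other convex term, h the reference function, and \lambda a step size set from the relative smoothness constant:

```latex
x_{k+1} = \arg\min_{x} \left\{ \langle \nabla f(x_k), x - x_k \rangle
          + \psi(x) + \tfrac{1}{\lambda} D_h(x, x_k) \right\},
\qquad
D_h(x, y) = h(x) - h(y) - \langle \nabla h(y), x - y \rangle .
```

With h(x) = \tfrac{1}{2}\|x\|^2, the Bregman distance D_h reduces to \tfrac{1}{2}\|x - y\|^2 and the update recovers the ordinary proximal gradient step.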